Oct  2 06:46:50 np0005466013 kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct  2 06:46:50 np0005466013 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  2 06:46:50 np0005466013 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:46:50 np0005466013 kernel: BIOS-provided physical RAM map:
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  2 06:46:50 np0005466013 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct  2 06:46:50 np0005466013 kernel: NX (Execute Disable) protection: active
Oct  2 06:46:50 np0005466013 kernel: APIC: Static calls initialized
Oct  2 06:46:50 np0005466013 kernel: SMBIOS 2.8 present.
Oct  2 06:46:50 np0005466013 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  2 06:46:50 np0005466013 kernel: Hypervisor detected: KVM
Oct  2 06:46:50 np0005466013 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  2 06:46:50 np0005466013 kernel: kvm-clock: using sched offset of 8035414528 cycles
Oct  2 06:46:50 np0005466013 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  2 06:46:50 np0005466013 kernel: tsc: Detected 2799.886 MHz processor
Oct  2 06:46:50 np0005466013 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct  2 06:46:50 np0005466013 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  2 06:46:50 np0005466013 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  2 06:46:50 np0005466013 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  2 06:46:50 np0005466013 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  2 06:46:50 np0005466013 kernel: Using GB pages for direct mapping
Oct  2 06:46:50 np0005466013 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  2 06:46:50 np0005466013 kernel: ACPI: Early table checksum verification disabled
Oct  2 06:46:50 np0005466013 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  2 06:46:50 np0005466013 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:50 np0005466013 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:50 np0005466013 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:50 np0005466013 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  2 06:46:50 np0005466013 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:50 np0005466013 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:50 np0005466013 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct  2 06:46:50 np0005466013 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct  2 06:46:50 np0005466013 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  2 06:46:50 np0005466013 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct  2 06:46:50 np0005466013 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct  2 06:46:50 np0005466013 kernel: No NUMA configuration found
Oct  2 06:46:50 np0005466013 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct  2 06:46:50 np0005466013 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct  2 06:46:50 np0005466013 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct  2 06:46:50 np0005466013 kernel: Zone ranges:
Oct  2 06:46:50 np0005466013 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  2 06:46:50 np0005466013 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  2 06:46:50 np0005466013 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct  2 06:46:50 np0005466013 kernel:  Device   empty
Oct  2 06:46:50 np0005466013 kernel: Movable zone start for each node
Oct  2 06:46:50 np0005466013 kernel: Early memory node ranges
Oct  2 06:46:50 np0005466013 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  2 06:46:50 np0005466013 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  2 06:46:50 np0005466013 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct  2 06:46:50 np0005466013 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct  2 06:46:50 np0005466013 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  2 06:46:50 np0005466013 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  2 06:46:50 np0005466013 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  2 06:46:50 np0005466013 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  2 06:46:50 np0005466013 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  2 06:46:50 np0005466013 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  2 06:46:50 np0005466013 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  2 06:46:50 np0005466013 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  2 06:46:50 np0005466013 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  2 06:46:50 np0005466013 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  2 06:46:50 np0005466013 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  2 06:46:50 np0005466013 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  2 06:46:50 np0005466013 kernel: TSC deadline timer available
Oct  2 06:46:50 np0005466013 kernel: CPU topo: Max. logical packages:   8
Oct  2 06:46:50 np0005466013 kernel: CPU topo: Max. logical dies:       8
Oct  2 06:46:50 np0005466013 kernel: CPU topo: Max. dies per package:   1
Oct  2 06:46:50 np0005466013 kernel: CPU topo: Max. threads per core:   1
Oct  2 06:46:50 np0005466013 kernel: CPU topo: Num. cores per package:     1
Oct  2 06:46:50 np0005466013 kernel: CPU topo: Num. threads per package:   1
Oct  2 06:46:50 np0005466013 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  2 06:46:50 np0005466013 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  2 06:46:50 np0005466013 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  2 06:46:50 np0005466013 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  2 06:46:50 np0005466013 kernel: Booting paravirtualized kernel on KVM
Oct  2 06:46:50 np0005466013 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  2 06:46:50 np0005466013 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  2 06:46:50 np0005466013 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  2 06:46:50 np0005466013 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  2 06:46:50 np0005466013 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:46:50 np0005466013 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  2 06:46:50 np0005466013 kernel: random: crng init done
Oct  2 06:46:50 np0005466013 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: Fallback order for Node 0: 0 
Oct  2 06:46:50 np0005466013 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct  2 06:46:50 np0005466013 kernel: Policy zone: Normal
Oct  2 06:46:50 np0005466013 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  2 06:46:50 np0005466013 kernel: software IO TLB: area num 8.
Oct  2 06:46:50 np0005466013 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  2 06:46:50 np0005466013 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  2 06:46:50 np0005466013 kernel: ftrace: allocated 193 pages with 3 groups
Oct  2 06:46:50 np0005466013 kernel: Dynamic Preempt: voluntary
Oct  2 06:46:50 np0005466013 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  2 06:46:50 np0005466013 kernel: rcu: 	RCU event tracing is enabled.
Oct  2 06:46:50 np0005466013 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  2 06:46:50 np0005466013 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct  2 06:46:50 np0005466013 kernel: 	Rude variant of Tasks RCU enabled.
Oct  2 06:46:50 np0005466013 kernel: 	Tracing variant of Tasks RCU enabled.
Oct  2 06:46:50 np0005466013 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  2 06:46:50 np0005466013 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  2 06:46:50 np0005466013 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:46:50 np0005466013 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:46:50 np0005466013 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:46:50 np0005466013 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  2 06:46:50 np0005466013 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  2 06:46:50 np0005466013 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  2 06:46:50 np0005466013 kernel: Console: colour VGA+ 80x25
Oct  2 06:46:50 np0005466013 kernel: printk: console [ttyS0] enabled
Oct  2 06:46:50 np0005466013 kernel: ACPI: Core revision 20230331
Oct  2 06:46:50 np0005466013 kernel: APIC: Switch to symmetric I/O mode setup
Oct  2 06:46:50 np0005466013 kernel: x2apic enabled
Oct  2 06:46:50 np0005466013 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  2 06:46:50 np0005466013 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  2 06:46:50 np0005466013 kernel: Calibrating delay loop (skipped) preset value.. 5599.77 BogoMIPS (lpj=2799886)
Oct  2 06:46:50 np0005466013 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  2 06:46:50 np0005466013 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  2 06:46:50 np0005466013 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  2 06:46:50 np0005466013 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  2 06:46:50 np0005466013 kernel: Spectre V2 : Mitigation: Retpolines
Oct  2 06:46:50 np0005466013 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  2 06:46:50 np0005466013 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  2 06:46:50 np0005466013 kernel: RETBleed: Mitigation: untrained return thunk
Oct  2 06:46:50 np0005466013 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  2 06:46:50 np0005466013 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  2 06:46:50 np0005466013 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  2 06:46:50 np0005466013 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  2 06:46:50 np0005466013 kernel: x86/bugs: return thunk changed
Oct  2 06:46:50 np0005466013 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  2 06:46:50 np0005466013 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  2 06:46:50 np0005466013 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  2 06:46:50 np0005466013 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  2 06:46:50 np0005466013 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  2 06:46:50 np0005466013 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  2 06:46:50 np0005466013 kernel: Freeing SMP alternatives memory: 40K
Oct  2 06:46:50 np0005466013 kernel: pid_max: default: 32768 minimum: 301
Oct  2 06:46:50 np0005466013 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  2 06:46:50 np0005466013 kernel: landlock: Up and running.
Oct  2 06:46:50 np0005466013 kernel: Yama: becoming mindful.
Oct  2 06:46:50 np0005466013 kernel: SELinux:  Initializing.
Oct  2 06:46:50 np0005466013 kernel: LSM support for eBPF active
Oct  2 06:46:50 np0005466013 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  2 06:46:50 np0005466013 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  2 06:46:50 np0005466013 kernel: ... version:                0
Oct  2 06:46:50 np0005466013 kernel: ... bit width:              48
Oct  2 06:46:50 np0005466013 kernel: ... generic registers:      6
Oct  2 06:46:50 np0005466013 kernel: ... value mask:             0000ffffffffffff
Oct  2 06:46:50 np0005466013 kernel: ... max period:             00007fffffffffff
Oct  2 06:46:50 np0005466013 kernel: ... fixed-purpose events:   0
Oct  2 06:46:50 np0005466013 kernel: ... event mask:             000000000000003f
Oct  2 06:46:50 np0005466013 kernel: signal: max sigframe size: 1776
Oct  2 06:46:50 np0005466013 kernel: rcu: Hierarchical SRCU implementation.
Oct  2 06:46:50 np0005466013 kernel: rcu: 	Max phase no-delay instances is 400.
Oct  2 06:46:50 np0005466013 kernel: smp: Bringing up secondary CPUs ...
Oct  2 06:46:50 np0005466013 kernel: smpboot: x86: Booting SMP configuration:
Oct  2 06:46:50 np0005466013 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  2 06:46:50 np0005466013 kernel: smp: Brought up 1 node, 8 CPUs
Oct  2 06:46:50 np0005466013 kernel: smpboot: Total of 8 processors activated (44798.17 BogoMIPS)
Oct  2 06:46:50 np0005466013 kernel: node 0 deferred pages initialised in 27ms
Oct  2 06:46:50 np0005466013 kernel: Memory: 7765536K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616500K reserved, 0K cma-reserved)
Oct  2 06:46:50 np0005466013 kernel: devtmpfs: initialized
Oct  2 06:46:50 np0005466013 kernel: x86/mm: Memory block size: 128MB
Oct  2 06:46:50 np0005466013 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  2 06:46:50 np0005466013 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: pinctrl core: initialized pinctrl subsystem
Oct  2 06:46:50 np0005466013 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  2 06:46:50 np0005466013 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct  2 06:46:50 np0005466013 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  2 06:46:50 np0005466013 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  2 06:46:50 np0005466013 kernel: audit: initializing netlink subsys (disabled)
Oct  2 06:46:50 np0005466013 kernel: audit: type=2000 audit(1759402008.469:1): state=initialized audit_enabled=0 res=1
Oct  2 06:46:50 np0005466013 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  2 06:46:50 np0005466013 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  2 06:46:50 np0005466013 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  2 06:46:50 np0005466013 kernel: cpuidle: using governor menu
Oct  2 06:46:50 np0005466013 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  2 06:46:50 np0005466013 kernel: PCI: Using configuration type 1 for base access
Oct  2 06:46:50 np0005466013 kernel: PCI: Using configuration type 1 for extended access
Oct  2 06:46:50 np0005466013 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  2 06:46:50 np0005466013 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  2 06:46:50 np0005466013 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  2 06:46:50 np0005466013 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  2 06:46:50 np0005466013 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  2 06:46:50 np0005466013 kernel: Demotion targets for Node 0: null
Oct  2 06:46:50 np0005466013 kernel: cryptd: max_cpu_qlen set to 1000
Oct  2 06:46:50 np0005466013 kernel: ACPI: Added _OSI(Module Device)
Oct  2 06:46:50 np0005466013 kernel: ACPI: Added _OSI(Processor Device)
Oct  2 06:46:50 np0005466013 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  2 06:46:50 np0005466013 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  2 06:46:50 np0005466013 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  2 06:46:50 np0005466013 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  2 06:46:50 np0005466013 kernel: ACPI: Interpreter enabled
Oct  2 06:46:50 np0005466013 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  2 06:46:50 np0005466013 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  2 06:46:50 np0005466013 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  2 06:46:50 np0005466013 kernel: PCI: Using E820 reservations for host bridge windows
Oct  2 06:46:50 np0005466013 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  2 06:46:50 np0005466013 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  2 06:46:50 np0005466013 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [3] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [4] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [5] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [6] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [7] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [8] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [9] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [10] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [11] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [12] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [13] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [14] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [15] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [16] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [17] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [18] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [19] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [20] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [21] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [22] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [23] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [24] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [25] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [26] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [27] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [28] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [29] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [30] registered
Oct  2 06:46:50 np0005466013 kernel: acpiphp: Slot [31] registered
Oct  2 06:46:50 np0005466013 kernel: PCI host bridge to bus 0000:00
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  2 06:46:50 np0005466013 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  2 06:46:50 np0005466013 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  2 06:46:50 np0005466013 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  2 06:46:50 np0005466013 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  2 06:46:50 np0005466013 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  2 06:46:50 np0005466013 kernel: iommu: Default domain type: Translated
Oct  2 06:46:50 np0005466013 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  2 06:46:50 np0005466013 kernel: SCSI subsystem initialized
Oct  2 06:46:50 np0005466013 kernel: ACPI: bus type USB registered
Oct  2 06:46:50 np0005466013 kernel: usbcore: registered new interface driver usbfs
Oct  2 06:46:50 np0005466013 kernel: usbcore: registered new interface driver hub
Oct  2 06:46:50 np0005466013 kernel: usbcore: registered new device driver usb
Oct  2 06:46:50 np0005466013 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  2 06:46:50 np0005466013 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  2 06:46:50 np0005466013 kernel: PTP clock support registered
Oct  2 06:46:50 np0005466013 kernel: EDAC MC: Ver: 3.0.0
Oct  2 06:46:50 np0005466013 kernel: NetLabel: Initializing
Oct  2 06:46:50 np0005466013 kernel: NetLabel:  domain hash size = 128
Oct  2 06:46:50 np0005466013 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  2 06:46:50 np0005466013 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  2 06:46:50 np0005466013 kernel: PCI: Using ACPI for IRQ routing
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  2 06:46:50 np0005466013 kernel: vgaarb: loaded
Oct  2 06:46:50 np0005466013 kernel: clocksource: Switched to clocksource kvm-clock
Oct  2 06:46:50 np0005466013 kernel: VFS: Disk quotas dquot_6.6.0
Oct  2 06:46:50 np0005466013 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  2 06:46:50 np0005466013 kernel: pnp: PnP ACPI init
Oct  2 06:46:50 np0005466013 kernel: pnp: PnP ACPI: found 5 devices
Oct  2 06:46:50 np0005466013 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  2 06:46:50 np0005466013 kernel: NET: Registered PF_INET protocol family
Oct  2 06:46:50 np0005466013 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct  2 06:46:50 np0005466013 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  2 06:46:50 np0005466013 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  2 06:46:50 np0005466013 kernel: NET: Registered PF_XDP protocol family
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  2 06:46:50 np0005466013 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  2 06:46:50 np0005466013 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  2 06:46:50 np0005466013 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 81210 usecs
Oct  2 06:46:50 np0005466013 kernel: PCI: CLS 0 bytes, default 64
Oct  2 06:46:50 np0005466013 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  2 06:46:50 np0005466013 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct  2 06:46:50 np0005466013 kernel: ACPI: bus type thunderbolt registered
Oct  2 06:46:50 np0005466013 kernel: Trying to unpack rootfs image as initramfs...
Oct  2 06:46:50 np0005466013 kernel: Initialise system trusted keyrings
Oct  2 06:46:50 np0005466013 kernel: Key type blacklist registered
Oct  2 06:46:50 np0005466013 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct  2 06:46:50 np0005466013 kernel: zbud: loaded
Oct  2 06:46:50 np0005466013 kernel: integrity: Platform Keyring initialized
Oct  2 06:46:50 np0005466013 kernel: integrity: Machine keyring initialized
Oct  2 06:46:50 np0005466013 kernel: Freeing initrd memory: 86104K
Oct  2 06:46:50 np0005466013 kernel: NET: Registered PF_ALG protocol family
Oct  2 06:46:50 np0005466013 kernel: xor: automatically using best checksumming function   avx       
Oct  2 06:46:50 np0005466013 kernel: Key type asymmetric registered
Oct  2 06:46:50 np0005466013 kernel: Asymmetric key parser 'x509' registered
Oct  2 06:46:50 np0005466013 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  2 06:46:50 np0005466013 kernel: io scheduler mq-deadline registered
Oct  2 06:46:50 np0005466013 kernel: io scheduler kyber registered
Oct  2 06:46:50 np0005466013 kernel: io scheduler bfq registered
Oct  2 06:46:50 np0005466013 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  2 06:46:50 np0005466013 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  2 06:46:50 np0005466013 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  2 06:46:50 np0005466013 kernel: ACPI: button: Power Button [PWRF]
Oct  2 06:46:50 np0005466013 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  2 06:46:50 np0005466013 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  2 06:46:50 np0005466013 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  2 06:46:50 np0005466013 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  2 06:46:50 np0005466013 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  2 06:46:50 np0005466013 kernel: Non-volatile memory driver v1.3
Oct  2 06:46:50 np0005466013 kernel: rdac: device handler registered
Oct  2 06:46:50 np0005466013 kernel: hp_sw: device handler registered
Oct  2 06:46:50 np0005466013 kernel: emc: device handler registered
Oct  2 06:46:50 np0005466013 kernel: alua: device handler registered
Oct  2 06:46:50 np0005466013 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  2 06:46:50 np0005466013 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  2 06:46:50 np0005466013 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  2 06:46:50 np0005466013 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct  2 06:46:50 np0005466013 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  2 06:46:50 np0005466013 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  2 06:46:50 np0005466013 kernel: usb usb1: Product: UHCI Host Controller
Oct  2 06:46:50 np0005466013 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  2 06:46:50 np0005466013 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  2 06:46:50 np0005466013 kernel: hub 1-0:1.0: USB hub found
Oct  2 06:46:50 np0005466013 kernel: hub 1-0:1.0: 2 ports detected
Oct  2 06:46:50 np0005466013 kernel: usbcore: registered new interface driver usbserial_generic
Oct  2 06:46:50 np0005466013 kernel: usbserial: USB Serial support registered for generic
Oct  2 06:46:50 np0005466013 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  2 06:46:50 np0005466013 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  2 06:46:50 np0005466013 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  2 06:46:50 np0005466013 kernel: mousedev: PS/2 mouse device common for all mice
Oct  2 06:46:50 np0005466013 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  2 06:46:50 np0005466013 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  2 06:46:50 np0005466013 kernel: rtc_cmos 00:04: registered as rtc0
Oct  2 06:46:50 np0005466013 kernel: rtc_cmos 00:04: setting system clock to 2025-10-02T10:46:49 UTC (1759402009)
Oct  2 06:46:50 np0005466013 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  2 06:46:50 np0005466013 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  2 06:46:50 np0005466013 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  2 06:46:50 np0005466013 kernel: usbcore: registered new interface driver usbhid
Oct  2 06:46:50 np0005466013 kernel: usbhid: USB HID core driver
Oct  2 06:46:50 np0005466013 kernel: drop_monitor: Initializing network drop monitor service
Oct  2 06:46:50 np0005466013 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  2 06:46:50 np0005466013 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  2 06:46:50 np0005466013 kernel: Initializing XFRM netlink socket
Oct  2 06:46:50 np0005466013 kernel: NET: Registered PF_INET6 protocol family
Oct  2 06:46:50 np0005466013 kernel: Segment Routing with IPv6
Oct  2 06:46:50 np0005466013 kernel: NET: Registered PF_PACKET protocol family
Oct  2 06:46:50 np0005466013 kernel: mpls_gso: MPLS GSO support
Oct  2 06:46:50 np0005466013 kernel: IPI shorthand broadcast: enabled
Oct  2 06:46:50 np0005466013 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  2 06:46:50 np0005466013 kernel: AES CTR mode by8 optimization enabled
Oct  2 06:46:50 np0005466013 kernel: sched_clock: Marking stable (1216002736, 146799735)->(1449282628, -86480157)
Oct  2 06:46:50 np0005466013 kernel: registered taskstats version 1
Oct  2 06:46:50 np0005466013 kernel: Loading compiled-in X.509 certificates
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  2 06:46:50 np0005466013 kernel: Demotion targets for Node 0: null
Oct  2 06:46:50 np0005466013 kernel: page_owner is disabled
Oct  2 06:46:50 np0005466013 kernel: Key type .fscrypt registered
Oct  2 06:46:50 np0005466013 kernel: Key type fscrypt-provisioning registered
Oct  2 06:46:50 np0005466013 kernel: Key type big_key registered
Oct  2 06:46:50 np0005466013 kernel: Key type encrypted registered
Oct  2 06:46:50 np0005466013 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  2 06:46:50 np0005466013 kernel: Loading compiled-in module X.509 certificates
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  2 06:46:50 np0005466013 kernel: ima: Allocated hash algorithm: sha256
Oct  2 06:46:50 np0005466013 kernel: ima: No architecture policies found
Oct  2 06:46:50 np0005466013 kernel: evm: Initialising EVM extended attributes:
Oct  2 06:46:50 np0005466013 kernel: evm: security.selinux
Oct  2 06:46:50 np0005466013 kernel: evm: security.SMACK64 (disabled)
Oct  2 06:46:50 np0005466013 kernel: evm: security.SMACK64EXEC (disabled)
Oct  2 06:46:50 np0005466013 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  2 06:46:50 np0005466013 kernel: evm: security.SMACK64MMAP (disabled)
Oct  2 06:46:50 np0005466013 kernel: evm: security.apparmor (disabled)
Oct  2 06:46:50 np0005466013 kernel: evm: security.ima
Oct  2 06:46:50 np0005466013 kernel: evm: security.capability
Oct  2 06:46:50 np0005466013 kernel: evm: HMAC attrs: 0x1
Oct  2 06:46:50 np0005466013 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  2 06:46:50 np0005466013 kernel: Running certificate verification RSA selftest
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  2 06:46:50 np0005466013 kernel: Running certificate verification ECDSA selftest
Oct  2 06:46:50 np0005466013 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  2 06:46:50 np0005466013 kernel: clk: Disabling unused clocks
Oct  2 06:46:50 np0005466013 kernel: Freeing unused decrypted memory: 2028K
Oct  2 06:46:50 np0005466013 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  2 06:46:50 np0005466013 kernel: Write protecting the kernel read-only data: 30720k
Oct  2 06:46:50 np0005466013 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  2 06:46:50 np0005466013 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  2 06:46:50 np0005466013 kernel: Run /init as init process
Oct  2 06:46:50 np0005466013 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  2 06:46:50 np0005466013 systemd: Detected virtualization kvm.
Oct  2 06:46:50 np0005466013 systemd: Detected architecture x86-64.
Oct  2 06:46:50 np0005466013 systemd: Running in initrd.
Oct  2 06:46:50 np0005466013 systemd: No hostname configured, using default hostname.
Oct  2 06:46:50 np0005466013 systemd: Hostname set to <localhost>.
Oct  2 06:46:50 np0005466013 systemd: Initializing machine ID from VM UUID.
Oct  2 06:46:50 np0005466013 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  2 06:46:50 np0005466013 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  2 06:46:50 np0005466013 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  2 06:46:50 np0005466013 kernel: usb 1-1: Manufacturer: QEMU
Oct  2 06:46:50 np0005466013 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  2 06:46:50 np0005466013 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  2 06:46:50 np0005466013 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  2 06:46:50 np0005466013 systemd: Queued start job for default target Initrd Default Target.
Oct  2 06:46:50 np0005466013 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  2 06:46:50 np0005466013 systemd: Reached target Local Encrypted Volumes.
Oct  2 06:46:50 np0005466013 systemd: Reached target Initrd /usr File System.
Oct  2 06:46:50 np0005466013 systemd: Reached target Local File Systems.
Oct  2 06:46:50 np0005466013 systemd: Reached target Path Units.
Oct  2 06:46:50 np0005466013 systemd: Reached target Slice Units.
Oct  2 06:46:50 np0005466013 systemd: Reached target Swaps.
Oct  2 06:46:50 np0005466013 systemd: Reached target Timer Units.
Oct  2 06:46:50 np0005466013 systemd: Listening on D-Bus System Message Bus Socket.
Oct  2 06:46:50 np0005466013 systemd: Listening on Journal Socket (/dev/log).
Oct  2 06:46:50 np0005466013 systemd: Listening on Journal Socket.
Oct  2 06:46:50 np0005466013 systemd: Listening on udev Control Socket.
Oct  2 06:46:50 np0005466013 systemd: Listening on udev Kernel Socket.
Oct  2 06:46:50 np0005466013 systemd: Reached target Socket Units.
Oct  2 06:46:50 np0005466013 systemd: Starting Create List of Static Device Nodes...
Oct  2 06:46:50 np0005466013 systemd: Starting Journal Service...
Oct  2 06:46:50 np0005466013 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  2 06:46:50 np0005466013 systemd: Starting Apply Kernel Variables...
Oct  2 06:46:50 np0005466013 systemd: Starting Create System Users...
Oct  2 06:46:50 np0005466013 systemd: Starting Setup Virtual Console...
Oct  2 06:46:50 np0005466013 systemd: Finished Create List of Static Device Nodes.
Oct  2 06:46:50 np0005466013 systemd: Finished Apply Kernel Variables.
Oct  2 06:46:50 np0005466013 systemd: Finished Create System Users.
Oct  2 06:46:50 np0005466013 systemd-journald[305]: Journal started
Oct  2 06:46:50 np0005466013 systemd-journald[305]: Runtime Journal (/run/log/journal/f771464327c84af0a2a6be96dd6da6b1) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:46:50 np0005466013 systemd-sysusers[310]: Creating group 'users' with GID 100.
Oct  2 06:46:50 np0005466013 systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Oct  2 06:46:50 np0005466013 systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  2 06:46:50 np0005466013 systemd: Starting Create Static Device Nodes in /dev...
Oct  2 06:46:50 np0005466013 systemd: Started Journal Service.
Oct  2 06:46:50 np0005466013 systemd[1]: Starting Create Volatile Files and Directories...
Oct  2 06:46:50 np0005466013 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  2 06:46:50 np0005466013 systemd[1]: Finished Create Volatile Files and Directories.
Oct  2 06:46:50 np0005466013 systemd[1]: Finished Setup Virtual Console.
Oct  2 06:46:50 np0005466013 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  2 06:46:50 np0005466013 systemd[1]: Starting dracut cmdline hook...
Oct  2 06:46:50 np0005466013 dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Oct  2 06:46:50 np0005466013 dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:46:50 np0005466013 systemd[1]: Finished dracut cmdline hook.
Oct  2 06:46:50 np0005466013 systemd[1]: Starting dracut pre-udev hook...
Oct  2 06:46:50 np0005466013 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  2 06:46:50 np0005466013 kernel: device-mapper: uevent: version 1.0.3
Oct  2 06:46:50 np0005466013 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  2 06:46:50 np0005466013 kernel: RPC: Registered named UNIX socket transport module.
Oct  2 06:46:50 np0005466013 kernel: RPC: Registered udp transport module.
Oct  2 06:46:50 np0005466013 kernel: RPC: Registered tcp transport module.
Oct  2 06:46:50 np0005466013 kernel: RPC: Registered tcp-with-tls transport module.
Oct  2 06:46:50 np0005466013 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  2 06:46:50 np0005466013 rpc.statd[443]: Version 2.5.4 starting
Oct  2 06:46:50 np0005466013 rpc.statd[443]: Initializing NSM state
Oct  2 06:46:50 np0005466013 rpc.idmapd[448]: Setting log level to 0
Oct  2 06:46:50 np0005466013 systemd[1]: Finished dracut pre-udev hook.
Oct  2 06:46:50 np0005466013 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  2 06:46:50 np0005466013 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Oct  2 06:46:50 np0005466013 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  2 06:46:50 np0005466013 systemd[1]: Starting dracut pre-trigger hook...
Oct  2 06:46:50 np0005466013 systemd[1]: Finished dracut pre-trigger hook.
Oct  2 06:46:50 np0005466013 systemd[1]: Starting Coldplug All udev Devices...
Oct  2 06:46:50 np0005466013 systemd[1]: Created slice Slice /system/modprobe.
Oct  2 06:46:50 np0005466013 systemd[1]: Starting Load Kernel Module configfs...
Oct  2 06:46:50 np0005466013 systemd[1]: Finished Coldplug All udev Devices.
Oct  2 06:46:50 np0005466013 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:46:50 np0005466013 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:46:50 np0005466013 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  2 06:46:50 np0005466013 systemd[1]: Reached target Network.
Oct  2 06:46:50 np0005466013 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  2 06:46:50 np0005466013 systemd[1]: Starting dracut initqueue hook...
Oct  2 06:46:50 np0005466013 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  2 06:46:50 np0005466013 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct  2 06:46:50 np0005466013 kernel: vda: vda1
Oct  2 06:46:50 np0005466013 systemd-udevd[495]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 06:46:50 np0005466013 kernel: scsi host0: ata_piix
Oct  2 06:46:50 np0005466013 kernel: scsi host1: ata_piix
Oct  2 06:46:50 np0005466013 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct  2 06:46:50 np0005466013 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct  2 06:46:50 np0005466013 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  2 06:46:50 np0005466013 systemd[1]: Reached target Initrd Root Device.
Oct  2 06:46:51 np0005466013 systemd[1]: Mounting Kernel Configuration File System...
Oct  2 06:46:51 np0005466013 kernel: ata1: found unknown device (class 0)
Oct  2 06:46:51 np0005466013 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  2 06:46:51 np0005466013 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  2 06:46:51 np0005466013 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  2 06:46:51 np0005466013 systemd[1]: Mounted Kernel Configuration File System.
Oct  2 06:46:51 np0005466013 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  2 06:46:51 np0005466013 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target System Initialization.
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target Basic System.
Oct  2 06:46:51 np0005466013 systemd[1]: Finished dracut initqueue hook.
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target Remote File Systems.
Oct  2 06:46:51 np0005466013 systemd[1]: Starting dracut pre-mount hook...
Oct  2 06:46:51 np0005466013 systemd[1]: Finished dracut pre-mount hook.
Oct  2 06:46:51 np0005466013 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  2 06:46:51 np0005466013 systemd-fsck[552]: /usr/sbin/fsck.xfs: XFS file system.
Oct  2 06:46:51 np0005466013 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  2 06:46:51 np0005466013 systemd[1]: Mounting /sysroot...
Oct  2 06:46:51 np0005466013 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  2 06:46:51 np0005466013 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  2 06:46:51 np0005466013 kernel: XFS (vda1): Ending clean mount
Oct  2 06:46:51 np0005466013 systemd[1]: Mounted /sysroot.
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target Initrd Root File System.
Oct  2 06:46:51 np0005466013 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  2 06:46:51 np0005466013 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  2 06:46:51 np0005466013 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target Initrd File Systems.
Oct  2 06:46:51 np0005466013 systemd[1]: Reached target Initrd Default Target.
Oct  2 06:46:51 np0005466013 systemd[1]: Starting dracut mount hook...
Oct  2 06:46:52 np0005466013 systemd[1]: Finished dracut mount hook.
Oct  2 06:46:52 np0005466013 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  2 06:46:52 np0005466013 rpc.idmapd[448]: exiting on signal 15
Oct  2 06:46:52 np0005466013 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  2 06:46:52 np0005466013 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Network.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Timer Units.
Oct  2 06:46:52 np0005466013 systemd[1]: dbus.socket: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  2 06:46:52 np0005466013 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Initrd Default Target.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Basic System.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Initrd Root Device.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Initrd /usr File System.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Path Units.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Remote File Systems.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Slice Units.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Socket Units.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target System Initialization.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Local File Systems.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Swaps.
Oct  2 06:46:52 np0005466013 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped dracut mount hook.
Oct  2 06:46:52 np0005466013 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped dracut pre-mount hook.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  2 06:46:52 np0005466013 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped dracut initqueue hook.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Apply Kernel Variables.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Coldplug All udev Devices.
Oct  2 06:46:52 np0005466013 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped dracut pre-trigger hook.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Setup Virtual Console.
Oct  2 06:46:52 np0005466013 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Closed udev Control Socket.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Closed udev Kernel Socket.
Oct  2 06:46:52 np0005466013 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped dracut pre-udev hook.
Oct  2 06:46:52 np0005466013 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped dracut cmdline hook.
Oct  2 06:46:52 np0005466013 systemd[1]: Starting Cleanup udev Database...
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  2 06:46:52 np0005466013 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  2 06:46:52 np0005466013 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Stopped Create System Users.
Oct  2 06:46:52 np0005466013 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  2 06:46:52 np0005466013 systemd[1]: Finished Cleanup udev Database.
Oct  2 06:46:52 np0005466013 systemd[1]: Reached target Switch Root.
Oct  2 06:46:52 np0005466013 systemd[1]: Starting Switch Root...
Oct  2 06:46:52 np0005466013 systemd[1]: Switching root.
Oct  2 06:46:52 np0005466013 systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Oct  2 06:46:52 np0005466013 systemd-journald[305]: Journal stopped
Oct  2 06:46:54 np0005466013 kernel: audit: type=1404 audit(1759402012.513:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  2 06:46:54 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 06:46:54 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 06:46:54 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 06:46:54 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 06:46:54 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 06:46:54 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 06:46:54 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 06:46:54 np0005466013 kernel: audit: type=1403 audit(1759402012.689:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  2 06:46:54 np0005466013 systemd: Successfully loaded SELinux policy in 182.328ms.
Oct  2 06:46:54 np0005466013 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.431ms.
Oct  2 06:46:54 np0005466013 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  2 06:46:54 np0005466013 systemd: Detected virtualization kvm.
Oct  2 06:46:54 np0005466013 systemd: Detected architecture x86-64.
Oct  2 06:46:54 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 06:46:54 np0005466013 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd: Stopped Switch Root.
Oct  2 06:46:54 np0005466013 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  2 06:46:54 np0005466013 systemd: Created slice Slice /system/getty.
Oct  2 06:46:54 np0005466013 systemd: Created slice Slice /system/serial-getty.
Oct  2 06:46:54 np0005466013 systemd: Created slice Slice /system/sshd-keygen.
Oct  2 06:46:54 np0005466013 systemd: Created slice User and Session Slice.
Oct  2 06:46:54 np0005466013 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  2 06:46:54 np0005466013 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  2 06:46:54 np0005466013 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  2 06:46:54 np0005466013 systemd: Reached target Local Encrypted Volumes.
Oct  2 06:46:54 np0005466013 systemd: Stopped target Switch Root.
Oct  2 06:46:54 np0005466013 systemd: Stopped target Initrd File Systems.
Oct  2 06:46:54 np0005466013 systemd: Stopped target Initrd Root File System.
Oct  2 06:46:54 np0005466013 systemd: Reached target Local Integrity Protected Volumes.
Oct  2 06:46:54 np0005466013 systemd: Reached target Path Units.
Oct  2 06:46:54 np0005466013 systemd: Reached target rpc_pipefs.target.
Oct  2 06:46:54 np0005466013 systemd: Reached target Slice Units.
Oct  2 06:46:54 np0005466013 systemd: Reached target Swaps.
Oct  2 06:46:54 np0005466013 systemd: Reached target Local Verity Protected Volumes.
Oct  2 06:46:54 np0005466013 systemd: Listening on RPCbind Server Activation Socket.
Oct  2 06:46:54 np0005466013 systemd: Reached target RPC Port Mapper.
Oct  2 06:46:54 np0005466013 systemd: Listening on Process Core Dump Socket.
Oct  2 06:46:54 np0005466013 systemd: Listening on initctl Compatibility Named Pipe.
Oct  2 06:46:54 np0005466013 systemd: Listening on udev Control Socket.
Oct  2 06:46:54 np0005466013 systemd: Listening on udev Kernel Socket.
Oct  2 06:46:54 np0005466013 systemd: Mounting Huge Pages File System...
Oct  2 06:46:54 np0005466013 systemd: Mounting POSIX Message Queue File System...
Oct  2 06:46:54 np0005466013 systemd: Mounting Kernel Debug File System...
Oct  2 06:46:54 np0005466013 systemd: Mounting Kernel Trace File System...
Oct  2 06:46:54 np0005466013 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  2 06:46:54 np0005466013 systemd: Starting Create List of Static Device Nodes...
Oct  2 06:46:54 np0005466013 systemd: Starting Load Kernel Module configfs...
Oct  2 06:46:54 np0005466013 systemd: Starting Load Kernel Module drm...
Oct  2 06:46:54 np0005466013 systemd: Starting Load Kernel Module efi_pstore...
Oct  2 06:46:54 np0005466013 systemd: Starting Load Kernel Module fuse...
Oct  2 06:46:54 np0005466013 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  2 06:46:54 np0005466013 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd: Stopped File System Check on Root Device.
Oct  2 06:46:54 np0005466013 systemd: Stopped Journal Service.
Oct  2 06:46:54 np0005466013 systemd: Starting Journal Service...
Oct  2 06:46:54 np0005466013 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  2 06:46:54 np0005466013 systemd: Starting Generate network units from Kernel command line...
Oct  2 06:46:54 np0005466013 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:46:54 np0005466013 systemd: Starting Remount Root and Kernel File Systems...
Oct  2 06:46:54 np0005466013 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  2 06:46:54 np0005466013 systemd: Starting Apply Kernel Variables...
Oct  2 06:46:54 np0005466013 kernel: fuse: init (API version 7.37)
Oct  2 06:46:54 np0005466013 systemd: Starting Coldplug All udev Devices...
Oct  2 06:46:54 np0005466013 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  2 06:46:54 np0005466013 systemd: Mounted Huge Pages File System.
Oct  2 06:46:54 np0005466013 systemd-journald[677]: Journal started
Oct  2 06:46:54 np0005466013 systemd-journald[677]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:46:54 np0005466013 systemd[1]: Queued start job for default target Multi-User System.
Oct  2 06:46:54 np0005466013 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd: Started Journal Service.
Oct  2 06:46:54 np0005466013 systemd[1]: Mounted POSIX Message Queue File System.
Oct  2 06:46:54 np0005466013 systemd[1]: Mounted Kernel Debug File System.
Oct  2 06:46:54 np0005466013 systemd[1]: Mounted Kernel Trace File System.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Create List of Static Device Nodes.
Oct  2 06:46:54 np0005466013 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:46:54 np0005466013 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct  2 06:46:54 np0005466013 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Load Kernel Module fuse.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Generate network units from Kernel command line.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Apply Kernel Variables.
Oct  2 06:46:54 np0005466013 kernel: ACPI: bus type drm_connector registered
Oct  2 06:46:54 np0005466013 systemd[1]: Mounting FUSE Control File System...
Oct  2 06:46:54 np0005466013 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Rebuild Hardware Database...
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  2 06:46:54 np0005466013 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Load/Save OS Random Seed...
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Create System Users...
Oct  2 06:46:54 np0005466013 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Load Kernel Module drm.
Oct  2 06:46:54 np0005466013 systemd-journald[677]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:46:54 np0005466013 systemd-journald[677]: Received client request to flush runtime journal.
Oct  2 06:46:54 np0005466013 systemd[1]: Mounted FUSE Control File System.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Load/Save OS Random Seed.
Oct  2 06:46:54 np0005466013 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Create System Users.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Coldplug All udev Devices.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  2 06:46:54 np0005466013 systemd[1]: Reached target Preparation for Local File Systems.
Oct  2 06:46:54 np0005466013 systemd[1]: Reached target Local File Systems.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct  2 06:46:54 np0005466013 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  2 06:46:54 np0005466013 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  2 06:46:54 np0005466013 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Automatic Boot Loader Update...
Oct  2 06:46:54 np0005466013 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Create Volatile Files and Directories...
Oct  2 06:46:54 np0005466013 bootctl[695]: Couldn't find EFI system partition, skipping.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Automatic Boot Loader Update.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Create Volatile Files and Directories.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Security Auditing Service...
Oct  2 06:46:54 np0005466013 systemd[1]: Starting RPC Bind...
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Rebuild Journal Catalog...
Oct  2 06:46:54 np0005466013 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  2 06:46:54 np0005466013 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Rebuild Journal Catalog.
Oct  2 06:46:54 np0005466013 systemd[1]: Started RPC Bind.
Oct  2 06:46:54 np0005466013 augenrules[706]: /sbin/augenrules: No change
Oct  2 06:46:54 np0005466013 augenrules[721]: No rules
Oct  2 06:46:54 np0005466013 augenrules[721]: enabled 1
Oct  2 06:46:54 np0005466013 augenrules[721]: failure 1
Oct  2 06:46:54 np0005466013 augenrules[721]: pid 701
Oct  2 06:46:54 np0005466013 augenrules[721]: rate_limit 0
Oct  2 06:46:54 np0005466013 augenrules[721]: backlog_limit 8192
Oct  2 06:46:54 np0005466013 augenrules[721]: lost 0
Oct  2 06:46:54 np0005466013 augenrules[721]: backlog 0
Oct  2 06:46:54 np0005466013 augenrules[721]: backlog_wait_time 60000
Oct  2 06:46:54 np0005466013 augenrules[721]: backlog_wait_time_actual 0
Oct  2 06:46:54 np0005466013 systemd[1]: Started Security Auditing Service.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Rebuild Hardware Database.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Update is Completed...
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Update is Completed.
Oct  2 06:46:54 np0005466013 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Oct  2 06:46:54 np0005466013 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  2 06:46:54 np0005466013 systemd[1]: Reached target System Initialization.
Oct  2 06:46:54 np0005466013 systemd[1]: Started dnf makecache --timer.
Oct  2 06:46:54 np0005466013 systemd[1]: Started Daily rotation of log files.
Oct  2 06:46:54 np0005466013 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  2 06:46:54 np0005466013 systemd[1]: Reached target Timer Units.
Oct  2 06:46:54 np0005466013 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  2 06:46:54 np0005466013 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  2 06:46:54 np0005466013 systemd[1]: Reached target Socket Units.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting D-Bus System Message Bus...
Oct  2 06:46:54 np0005466013 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:46:54 np0005466013 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Load Kernel Module configfs...
Oct  2 06:46:54 np0005466013 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:46:54 np0005466013 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:46:54 np0005466013 systemd[1]: Started D-Bus System Message Bus.
Oct  2 06:46:54 np0005466013 systemd[1]: Reached target Basic System.
Oct  2 06:46:54 np0005466013 systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 06:46:54 np0005466013 dbus-broker-lau[741]: Ready
Oct  2 06:46:54 np0005466013 systemd[1]: Starting NTP client/server...
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  2 06:46:54 np0005466013 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  2 06:46:54 np0005466013 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  2 06:46:54 np0005466013 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  2 06:46:54 np0005466013 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  2 06:46:54 np0005466013 systemd[1]: Starting IPv4 firewall with iptables...
Oct  2 06:46:55 np0005466013 systemd[1]: Started irqbalance daemon.
Oct  2 06:46:55 np0005466013 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  2 06:46:55 np0005466013 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:46:55 np0005466013 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:46:55 np0005466013 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:46:55 np0005466013 systemd[1]: Reached target sshd-keygen.target.
Oct  2 06:46:55 np0005466013 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  2 06:46:55 np0005466013 systemd[1]: Reached target User and Group Name Lookups.
Oct  2 06:46:55 np0005466013 systemd[1]: Starting User Login Management...
Oct  2 06:46:55 np0005466013 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  2 06:46:55 np0005466013 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  2 06:46:55 np0005466013 chronyd[794]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  2 06:46:55 np0005466013 chronyd[794]: Loaded 0 symmetric keys
Oct  2 06:46:55 np0005466013 chronyd[794]: Using right/UTC timezone to obtain leap second data
Oct  2 06:46:55 np0005466013 chronyd[794]: Loaded seccomp filter (level 2)
Oct  2 06:46:55 np0005466013 systemd[1]: Started NTP client/server.
Oct  2 06:46:55 np0005466013 systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  2 06:46:55 np0005466013 systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  2 06:46:55 np0005466013 systemd-logind[784]: New seat seat0.
Oct  2 06:46:55 np0005466013 systemd[1]: Started User Login Management.
Oct  2 06:46:55 np0005466013 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  2 06:46:55 np0005466013 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  2 06:46:55 np0005466013 kernel: kvm_amd: TSC scaling supported
Oct  2 06:46:55 np0005466013 kernel: kvm_amd: Nested Virtualization enabled
Oct  2 06:46:55 np0005466013 kernel: kvm_amd: Nested Paging enabled
Oct  2 06:46:55 np0005466013 kernel: kvm_amd: LBR virtualization supported
Oct  2 06:46:55 np0005466013 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  2 06:46:55 np0005466013 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  2 06:46:55 np0005466013 kernel: Console: switching to colour dummy device 80x25
Oct  2 06:46:55 np0005466013 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  2 06:46:55 np0005466013 kernel: [drm] features: -context_init
Oct  2 06:46:55 np0005466013 kernel: [drm] number of scanouts: 1
Oct  2 06:46:55 np0005466013 kernel: [drm] number of cap sets: 0
Oct  2 06:46:55 np0005466013 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  2 06:46:55 np0005466013 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  2 06:46:55 np0005466013 kernel: Console: switching to colour frame buffer device 128x48
Oct  2 06:46:55 np0005466013 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  2 06:46:55 np0005466013 iptables.init[778]: iptables: Applying firewall rules: [  OK  ]
Oct  2 06:46:55 np0005466013 systemd[1]: Finished IPv4 firewall with iptables.
Oct  2 06:46:55 np0005466013 cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Thu, 02 Oct 2025 10:46:55 +0000. Up 7.40 seconds.
Oct  2 06:46:55 np0005466013 systemd[1]: run-cloud\x2dinit-tmp-tmp5b0b12tu.mount: Deactivated successfully.
Oct  2 06:46:56 np0005466013 systemd[1]: Starting Hostname Service...
Oct  2 06:46:56 np0005466013 systemd[1]: Started Hostname Service.
Oct  2 06:46:56 np0005466013 systemd-hostnamed[853]: Hostname set to <np0005466013.novalocal> (static)
Oct  2 06:46:56 np0005466013 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  2 06:46:56 np0005466013 systemd[1]: Reached target Preparation for Network.
Oct  2 06:46:56 np0005466013 systemd[1]: Starting Network Manager...
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3399] NetworkManager (version 1.54.1-1.el9) is starting... (boot:e608a69a-f987-4b3e-a0e0-e678d85d8e75)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3405] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3562] manager[0x55dcc6f4c080]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3609] hostname: hostname: using hostnamed
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3609] hostname: static hostname changed from (none) to "np0005466013.novalocal"
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3613] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3748] manager[0x55dcc6f4c080]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3751] manager[0x55dcc6f4c080]: rfkill: WWAN hardware radio set enabled
Oct  2 06:46:56 np0005466013 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3858] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3858] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3859] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3859] manager: Networking is enabled by state file
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3861] settings: Loaded settings plugin: keyfile (internal)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3927] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3955] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3982] dhcp: init: Using DHCP client 'internal'
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.3985] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4002] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4016] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4026] device (lo): Activation: starting connection 'lo' (e6177a1f-26b3-4119-8eb3-690387d13626)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4037] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4042] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 06:46:56 np0005466013 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4095] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4101] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4104] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4107] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4111] device (eth0): carrier: link connected
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4114] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 06:46:56 np0005466013 systemd[1]: Started Network Manager.
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4122] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  2 06:46:56 np0005466013 systemd[1]: Reached target Network.
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4140] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4144] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4145] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4149] manager: NetworkManager state is now CONNECTING
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4150] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4158] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4162] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:46:56 np0005466013 systemd[1]: Starting Network Manager Wait Online...
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4197] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Oct  2 06:46:56 np0005466013 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4206] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4234] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 06:46:56 np0005466013 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4334] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4337] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4339] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4348] device (lo): Activation: successful, device activated.
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4354] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4357] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4359] device (eth0): Activation: successful, device activated.
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4365] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 06:46:56 np0005466013 NetworkManager[857]: <info>  [1759402016.4367] manager: startup complete
Oct  2 06:46:56 np0005466013 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  2 06:46:56 np0005466013 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  2 06:46:56 np0005466013 systemd[1]: Reached target NFS client services.
Oct  2 06:46:56 np0005466013 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  2 06:46:56 np0005466013 systemd[1]: Reached target Remote File Systems.
Oct  2 06:46:56 np0005466013 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:46:56 np0005466013 systemd[1]: Finished Network Manager Wait Online.
Oct  2 06:46:56 np0005466013 systemd[1]: Starting Cloud-init: Network Stage...
Oct  2 06:46:56 np0005466013 cloud-init[921]: Cloud-init v. 24.4-7.el9 running 'init' at Thu, 02 Oct 2025 10:46:56 +0000. Up 8.43 seconds.
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |  eth0  | True |         38.102.83.45         | 255.255.255.0 | global | fa:16:3e:c2:af:05 |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:fec2:af05/64 |       .       |  link  | fa:16:3e:c2:af:05 |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct  2 06:46:56 np0005466013 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:46:58 np0005466013 cloud-init[921]: Generating public/private rsa key pair.
Oct  2 06:46:58 np0005466013 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct  2 06:46:58 np0005466013 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct  2 06:46:58 np0005466013 cloud-init[921]: The key fingerprint is:
Oct  2 06:46:58 np0005466013 cloud-init[921]: SHA256:e8hrWQiEMSqjip9L27XbzidI/qKge4OAbc5abdgdRqE root@np0005466013.novalocal
Oct  2 06:46:58 np0005466013 cloud-init[921]: The key's randomart image is:
Oct  2 06:46:58 np0005466013 cloud-init[921]: +---[RSA 3072]----+
Oct  2 06:46:58 np0005466013 cloud-init[921]: |    oo.          |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |   ..o..         |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |o . E..          |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |.o   ..          |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |o.    o.S.       |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |= o+ o.o.o.      |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |+== +oo.+o.      |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |.==* .+++o.      |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |++=oo.o**o       |
Oct  2 06:46:58 np0005466013 cloud-init[921]: +----[SHA256]-----+
Oct  2 06:46:58 np0005466013 cloud-init[921]: Generating public/private ecdsa key pair.
Oct  2 06:46:58 np0005466013 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct  2 06:46:58 np0005466013 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct  2 06:46:58 np0005466013 cloud-init[921]: The key fingerprint is:
Oct  2 06:46:58 np0005466013 cloud-init[921]: SHA256:cYKKNNLDo2PwPCiMknN32Ixi08HhciVFtoU/lXtagU4 root@np0005466013.novalocal
Oct  2 06:46:58 np0005466013 cloud-init[921]: The key's randomart image is:
Oct  2 06:46:58 np0005466013 cloud-init[921]: +---[ECDSA 256]---+
Oct  2 06:46:58 np0005466013 cloud-init[921]: |    oo=..  o     |
Oct  2 06:46:58 np0005466013 cloud-init[921]: | o o =.+  E .    |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |o O = o.o+.. .   |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |+O O B  o+o o    |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |XoX * + S. +     |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |+= = .    .      |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |                 |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |                 |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |                 |
Oct  2 06:46:58 np0005466013 cloud-init[921]: +----[SHA256]-----+
Oct  2 06:46:58 np0005466013 cloud-init[921]: Generating public/private ed25519 key pair.
Oct  2 06:46:58 np0005466013 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct  2 06:46:58 np0005466013 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct  2 06:46:58 np0005466013 cloud-init[921]: The key fingerprint is:
Oct  2 06:46:58 np0005466013 cloud-init[921]: SHA256:BogmYyN/gVXnU5ow7kQBT9vV/KuQSqNVenb6dITzUqU root@np0005466013.novalocal
Oct  2 06:46:58 np0005466013 cloud-init[921]: The key's randomart image is:
Oct  2 06:46:58 np0005466013 cloud-init[921]: +--[ED25519 256]--+
Oct  2 06:46:58 np0005466013 cloud-init[921]: |    ooB.. oo     |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |   + * B =  o    |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |+o+ o * *    .  .|
Oct  2 06:46:58 np0005466013 cloud-init[921]: |o=.  + . ..  ..o |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |  . . . So .o E. |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |   .   .= = .=.  |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |       + = +o.o  |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |      . . ...o   |
Oct  2 06:46:58 np0005466013 cloud-init[921]: |           ..    |
Oct  2 06:46:58 np0005466013 cloud-init[921]: +----[SHA256]-----+
Oct  2 06:46:58 np0005466013 systemd[1]: Finished Cloud-init: Network Stage.
Oct  2 06:46:58 np0005466013 systemd[1]: Reached target Cloud-config availability.
Oct  2 06:46:58 np0005466013 systemd[1]: Reached target Network is Online.
Oct  2 06:46:58 np0005466013 systemd[1]: Starting Cloud-init: Config Stage...
Oct  2 06:46:58 np0005466013 systemd[1]: Starting Notify NFS peers of a restart...
Oct  2 06:46:58 np0005466013 systemd[1]: Starting System Logging Service...
Oct  2 06:46:58 np0005466013 systemd[1]: Starting OpenSSH server daemon...
Oct  2 06:46:58 np0005466013 sm-notify[1002]: Version 2.5.4 starting
Oct  2 06:46:58 np0005466013 systemd[1]: Starting Permit User Sessions...
Oct  2 06:46:58 np0005466013 systemd[1]: Started Notify NFS peers of a restart.
Oct  2 06:46:58 np0005466013 systemd[1]: Started OpenSSH server daemon.
Oct  2 06:46:58 np0005466013 systemd[1]: Finished Permit User Sessions.
Oct  2 06:46:58 np0005466013 systemd[1]: Started Command Scheduler.
Oct  2 06:46:58 np0005466013 systemd[1]: Started Getty on tty1.
Oct  2 06:46:58 np0005466013 systemd[1]: Started Serial Getty on ttyS0.
Oct  2 06:46:58 np0005466013 systemd[1]: Reached target Login Prompts.
Oct  2 06:46:58 np0005466013 rsyslogd[1003]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1003" x-info="https://www.rsyslog.com"] start
Oct  2 06:46:58 np0005466013 rsyslogd[1003]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct  2 06:46:58 np0005466013 systemd[1]: Started System Logging Service.
Oct  2 06:46:58 np0005466013 systemd[1]: Reached target Multi-User System.
Oct  2 06:46:58 np0005466013 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  2 06:46:58 np0005466013 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  2 06:46:58 np0005466013 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  2 06:46:58 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 06:46:58 np0005466013 cloud-init[1015]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Thu, 02 Oct 2025 10:46:58 +0000. Up 10.34 seconds.
Oct  2 06:46:58 np0005466013 systemd[1]: Finished Cloud-init: Config Stage.
Oct  2 06:46:58 np0005466013 systemd[1]: Starting Cloud-init: Final Stage...
Oct  2 06:46:59 np0005466013 cloud-init[1019]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Thu, 02 Oct 2025 10:46:59 +0000. Up 10.74 seconds.
Oct  2 06:46:59 np0005466013 cloud-init[1021]: #############################################################
Oct  2 06:46:59 np0005466013 cloud-init[1022]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct  2 06:46:59 np0005466013 cloud-init[1024]: 256 SHA256:cYKKNNLDo2PwPCiMknN32Ixi08HhciVFtoU/lXtagU4 root@np0005466013.novalocal (ECDSA)
Oct  2 06:46:59 np0005466013 cloud-init[1026]: 256 SHA256:BogmYyN/gVXnU5ow7kQBT9vV/KuQSqNVenb6dITzUqU root@np0005466013.novalocal (ED25519)
Oct  2 06:46:59 np0005466013 cloud-init[1028]: 3072 SHA256:e8hrWQiEMSqjip9L27XbzidI/qKge4OAbc5abdgdRqE root@np0005466013.novalocal (RSA)
Oct  2 06:46:59 np0005466013 cloud-init[1029]: -----END SSH HOST KEY FINGERPRINTS-----
Oct  2 06:46:59 np0005466013 cloud-init[1030]: #############################################################
Oct  2 06:46:59 np0005466013 cloud-init[1019]: Cloud-init v. 24.4-7.el9 finished at Thu, 02 Oct 2025 10:46:59 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.92 seconds
Oct  2 06:46:59 np0005466013 systemd[1]: Finished Cloud-init: Final Stage.
Oct  2 06:46:59 np0005466013 systemd[1]: Reached target Cloud-init target.
Oct  2 06:46:59 np0005466013 systemd[1]: Startup finished in 1.550s (kernel) + 2.624s (initrd) + 6.813s (userspace) = 10.988s.
Oct  2 06:47:01 np0005466013 chronyd[794]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Oct  2 06:47:01 np0005466013 chronyd[794]: System clock TAI offset set to 37 seconds
Oct  2 06:47:05 np0005466013 irqbalance[782]: Cannot change IRQ 25 affinity: Operation not permitted
Oct  2 06:47:05 np0005466013 irqbalance[782]: IRQ 25 affinity is now unmanaged
Oct  2 06:47:05 np0005466013 irqbalance[782]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  2 06:47:05 np0005466013 irqbalance[782]: IRQ 31 affinity is now unmanaged
Oct  2 06:47:05 np0005466013 irqbalance[782]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  2 06:47:05 np0005466013 irqbalance[782]: IRQ 28 affinity is now unmanaged
Oct  2 06:47:05 np0005466013 irqbalance[782]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  2 06:47:05 np0005466013 irqbalance[782]: IRQ 32 affinity is now unmanaged
Oct  2 06:47:05 np0005466013 irqbalance[782]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  2 06:47:05 np0005466013 irqbalance[782]: IRQ 30 affinity is now unmanaged
Oct  2 06:47:05 np0005466013 irqbalance[782]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  2 06:47:05 np0005466013 irqbalance[782]: IRQ 29 affinity is now unmanaged
Oct  2 06:47:06 np0005466013 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 06:47:26 np0005466013 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 06:59:29 np0005466013 systemd[1]: Starting dnf makecache...
Oct  2 06:59:30 np0005466013 dnf[1065]: Failed determining last makecache time.
Oct  2 06:59:30 np0005466013 dnf[1065]: CentOS Stream 9 - BaseOS                         51 kB/s | 6.7 kB     00:00
Oct  2 06:59:30 np0005466013 dnf[1065]: CentOS Stream 9 - AppStream                      26 kB/s | 6.8 kB     00:00
Oct  2 06:59:31 np0005466013 dnf[1065]: CentOS Stream 9 - CRB                            58 kB/s | 6.6 kB     00:00
Oct  2 06:59:31 np0005466013 dnf[1065]: CentOS Stream 9 - Extras packages                80 kB/s | 8.0 kB     00:00
Oct  2 06:59:31 np0005466013 dnf[1065]: Metadata cache created.
Oct  2 06:59:31 np0005466013 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  2 06:59:31 np0005466013 systemd[1]: Finished dnf makecache.
Oct  2 07:01:41 np0005466013 systemd[1]: Created slice User Slice of UID 1000.
Oct  2 07:01:41 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  2 07:01:41 np0005466013 systemd-logind[784]: New session 1 of user zuul.
Oct  2 07:01:41 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  2 07:01:41 np0005466013 systemd[1]: Starting User Manager for UID 1000...
Oct  2 07:01:41 np0005466013 systemd[1094]: Queued start job for default target Main User Target.
Oct  2 07:01:41 np0005466013 systemd[1094]: Created slice User Application Slice.
Oct  2 07:01:41 np0005466013 systemd[1094]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 07:01:41 np0005466013 systemd[1094]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:01:41 np0005466013 systemd[1094]: Reached target Paths.
Oct  2 07:01:41 np0005466013 systemd[1094]: Reached target Timers.
Oct  2 07:01:41 np0005466013 systemd[1094]: Starting D-Bus User Message Bus Socket...
Oct  2 07:01:41 np0005466013 systemd[1094]: Starting Create User's Volatile Files and Directories...
Oct  2 07:01:41 np0005466013 systemd[1094]: Finished Create User's Volatile Files and Directories.
Oct  2 07:01:41 np0005466013 systemd[1094]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:01:41 np0005466013 systemd[1094]: Reached target Sockets.
Oct  2 07:01:41 np0005466013 systemd[1094]: Reached target Basic System.
Oct  2 07:01:41 np0005466013 systemd[1094]: Reached target Main User Target.
Oct  2 07:01:41 np0005466013 systemd[1094]: Startup finished in 188ms.
Oct  2 07:01:41 np0005466013 systemd[1]: Started User Manager for UID 1000.
Oct  2 07:01:41 np0005466013 systemd[1]: Started Session 1 of User zuul.
Oct  2 07:01:42 np0005466013 python3[1177]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:01:47 np0005466013 python3[1205]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:01:54 np0005466013 python3[1263]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:01:54 np0005466013 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  2 07:01:54 np0005466013 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  2 07:01:54 np0005466013 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  2 07:01:54 np0005466013 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  2 07:01:55 np0005466013 python3[1307]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct  2 07:01:57 np0005466013 python3[1333]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQCZ3vFv33zuU9QR4Erz5ZRISFa/oPvha0xwrBdyzVa18ydYUaCm/1GZP9yUXeHFz7iqX2LFQYNsjkqZJz1Uu67Idku6xgJC7Fx6g9BMv0MT1Zlak1CqYHg2DEyLxPerFs9LKBlOaZV+zN8b4kdG8Ww5E2kG2A7Ui3Cuzht/VP01bi+s4UjtwKH6CZ6X56ylQhY7z0Z+hPDBDFz1Oy2SYkyvdrztTs4eWaoebh/cWCdWX0V2djhSx6cc/r+wVBz3Aibc6gZzEn+Gpq8ffdM/6w/oD9Iqy6ijpCtmVA92FGjAJvr33J1xKd5XxDh4pvKaqFm7hjEeL+KJ1Z1ABjWrwV0uQNNHxit/J8k2+UdRsH+ZYoO3rrg4X8rEHQr981ffbmUPm16g5UJE1TZx20ZMh8oTkA5hXg5ydzjiktL9jGvgn+fSI1iCi1fdR/jUZ3xfQN6Q23wnG7lApoHjP4JXM75nxGNc0elGo9oGrDGWSVEwTOqp4qQPIuFtq+hNm1uTU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:01:57 np0005466013 python3[1357]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:01:58 np0005466013 python3[1456]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:01:58 np0005466013 python3[1527]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759402918.0615737-254-69464586238258/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=e98c7d2ea8ba4729beeb0aae1d087b01_id_rsa follow=False checksum=84e221810e16da2c918261cb937e6458833c76e7 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:01:59 np0005466013 python3[1650]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:01:59 np0005466013 python3[1721]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759402919.0763261-309-184236824702586/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=e98c7d2ea8ba4729beeb0aae1d087b01_id_rsa.pub follow=False checksum=4dfd599c92ceccedc02682f946e69efec3324503 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:01 np0005466013 python3[1769]: ansible-ping Invoked with data=pong
Oct  2 07:02:02 np0005466013 python3[1793]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:02:04 np0005466013 python3[1851]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct  2 07:02:05 np0005466013 python3[1883]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:05 np0005466013 python3[1907]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466013 python3[1931]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466013 python3[1955]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466013 python3[1979]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466013 python3[2003]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:08 np0005466013 python3[2029]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:09 np0005466013 python3[2107]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:10 np0005466013 python3[2180]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402929.0808434-34-237191245951442/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:10 np0005466013 python3[2228]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:11 np0005466013 python3[2252]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:11 np0005466013 python3[2276]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:11 np0005466013 python3[2300]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466013 python3[2324]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466013 python3[2348]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466013 python3[2372]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466013 python3[2396]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:13 np0005466013 python3[2420]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:13 np0005466013 python3[2444]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:13 np0005466013 python3[2468]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:14 np0005466013 python3[2492]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:14 np0005466013 python3[2516]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:14 np0005466013 python3[2540]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:14 np0005466013 python3[2564]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:15 np0005466013 python3[2588]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:15 np0005466013 python3[2612]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:15 np0005466013 python3[2636]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:16 np0005466013 python3[2660]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:16 np0005466013 python3[2684]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:16 np0005466013 python3[2708]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:17 np0005466013 python3[2732]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:17 np0005466013 python3[2756]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:17 np0005466013 python3[2780]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:17 np0005466013 python3[2804]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:18 np0005466013 python3[2828]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:20 np0005466013 python3[2854]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  2 07:02:20 np0005466013 systemd[1]: Starting Time & Date Service...
Oct  2 07:02:20 np0005466013 systemd[1]: Started Time & Date Service.
Oct  2 07:02:20 np0005466013 systemd-timedated[2856]: Changed time zone to 'UTC' (UTC).
Oct  2 07:02:22 np0005466013 python3[2885]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:22 np0005466013 python3[2961]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:23 np0005466013 python3[3032]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759402942.5146265-254-33069120796874/source _original_basename=tmpye2pn_ez follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:23 np0005466013 python3[3132]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:24 np0005466013 python3[3203]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759402943.4110563-304-160454230044279/source _original_basename=tmp8gms2p7i follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:24 np0005466013 python3[3305]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:25 np0005466013 python3[3378]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759402944.6011956-384-137217347354466/source _original_basename=tmpogqqgi0z follow=False checksum=46661bebe1ecf41d3f6ce259e9385cc50a3d3082 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:25 np0005466013 python3[3426]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:02:26 np0005466013 python3[3452]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:02:26 np0005466013 python3[3532]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:26 np0005466013 python3[3605]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402946.3020294-454-178869141442884/source _original_basename=tmpgzsmreh7 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:27 np0005466013 python3[3656]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-893c-5f89-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:02:28 np0005466013 python3[3684]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-893c-5f89-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct  2 07:02:29 np0005466013 python3[3712]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:47 np0005466013 python3[3738]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:50 np0005466013 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 07:03:47 np0005466013 systemd-logind[784]: Session 1 logged out. Waiting for processes to exit.
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct  2 07:04:15 np0005466013 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct  2 07:04:15 np0005466013 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3283] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 07:04:15 np0005466013 systemd-udevd[3743]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3497] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3527] settings: (eth1): created default wired connection 'Wired connection 1'
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3533] device (eth1): carrier: link connected
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3536] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3543] policy: auto-activating connection 'Wired connection 1' (72057882-bf27-3d3f-a38a-a878b09c7d39)
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3546] device (eth1): Activation: starting connection 'Wired connection 1' (72057882-bf27-3d3f-a38a-a878b09c7d39)
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3548] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3552] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3557] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:04:15 np0005466013 NetworkManager[857]: <info>  [1759403055.3563] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:15 np0005466013 systemd[1094]: Starting Mark boot as successful...
Oct  2 07:04:15 np0005466013 systemd[1094]: Finished Mark boot as successful.
Oct  2 07:04:16 np0005466013 systemd-logind[784]: New session 3 of user zuul.
Oct  2 07:04:16 np0005466013 systemd[1]: Started Session 3 of User zuul.
Oct  2 07:04:16 np0005466013 python3[3774]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-d220-2459-0000000001ea-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:04:23 np0005466013 python3[3856]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:04:24 np0005466013 python3[3929]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759403063.3989656-206-237067109640065/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=c7a8c7a7cb44703011e053f30708cce7b5922f63 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:04:24 np0005466013 python3[3979]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:04:24 np0005466013 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  2 07:04:24 np0005466013 systemd[1]: Stopped Network Manager Wait Online.
Oct  2 07:04:24 np0005466013 systemd[1]: Stopping Network Manager Wait Online...
Oct  2 07:04:24 np0005466013 systemd[1]: Stopping Network Manager...
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7692] caught SIGTERM, shutting down normally.
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7702] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7702] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7702] dhcp4 (eth0): state changed no lease
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7704] manager: NetworkManager state is now CONNECTING
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7768] dhcp4 (eth1): canceled DHCP transaction
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7769] dhcp4 (eth1): state changed no lease
Oct  2 07:04:24 np0005466013 NetworkManager[857]: <info>  [1759403064.7803] exiting (success)
Oct  2 07:04:24 np0005466013 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:04:24 np0005466013 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  2 07:04:24 np0005466013 systemd[1]: Stopped Network Manager.
Oct  2 07:04:24 np0005466013 systemd[1]: NetworkManager.service: Consumed 5.790s CPU time, 10.0M memory peak.
Oct  2 07:04:24 np0005466013 systemd[1]: Starting Network Manager...
Oct  2 07:04:24 np0005466013 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.8473] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:e608a69a-f987-4b3e-a0e0-e678d85d8e75)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.8478] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.8541] manager[0x558ea0168070]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 07:04:24 np0005466013 systemd[1]: Starting Hostname Service...
Oct  2 07:04:24 np0005466013 systemd[1]: Started Hostname Service.
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9570] hostname: hostname: using hostnamed
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9571] hostname: static hostname changed from (none) to "np0005466013.novalocal"
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9581] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9588] manager[0x558ea0168070]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9589] manager[0x558ea0168070]: rfkill: WWAN hardware radio set enabled
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9634] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9635] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9635] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9636] manager: Networking is enabled by state file
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9640] settings: Loaded settings plugin: keyfile (internal)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9646] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9686] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9704] dhcp: init: Using DHCP client 'internal'
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9709] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9716] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9725] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9738] device (lo): Activation: starting connection 'lo' (e6177a1f-26b3-4119-8eb3-690387d13626)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9749] device (eth0): carrier: link connected
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9756] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9764] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9764] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9774] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9784] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9793] device (eth1): carrier: link connected
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9800] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9807] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (72057882-bf27-3d3f-a38a-a878b09c7d39) (indicated)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9808] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9816] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9826] device (eth1): Activation: starting connection 'Wired connection 1' (72057882-bf27-3d3f-a38a-a878b09c7d39)
Oct  2 07:04:24 np0005466013 systemd[1]: Started Network Manager.
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9835] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9844] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9848] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9851] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9854] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9859] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9862] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9865] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9870] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9880] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9884] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9896] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9899] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9928] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Oct  2 07:04:24 np0005466013 NetworkManager[3984]: <info>  [1759403064.9936] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 07:04:24 np0005466013 systemd[1]: Starting Network Manager Wait Online...
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0003] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0009] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0018] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0025] device (lo): Activation: successful, device activated.
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0068] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0072] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0076] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0081] device (eth0): Activation: successful, device activated.
Oct  2 07:04:25 np0005466013 NetworkManager[3984]: <info>  [1759403065.0088] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 07:04:25 np0005466013 python3[4064]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-d220-2459-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:04:35 np0005466013 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:04:54 np0005466013 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3221] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:05:10 np0005466013 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:05:10 np0005466013 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3550] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3553] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3563] device (eth1): Activation: successful, device activated.
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3572] manager: startup complete
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3575] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <warn>  [1759403110.3582] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3592] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct  2 07:05:10 np0005466013 systemd[1]: Finished Network Manager Wait Online.
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3679] dhcp4 (eth1): canceled DHCP transaction
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3679] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3680] dhcp4 (eth1): state changed no lease
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3698] policy: auto-activating connection 'ci-private-network' (56c504e7-1263-5e10-8bcd-ab3a397bb040)
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3705] device (eth1): Activation: starting connection 'ci-private-network' (56c504e7-1263-5e10-8bcd-ab3a397bb040)
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3706] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3710] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3719] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3733] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3776] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3778] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:05:10 np0005466013 NetworkManager[3984]: <info>  [1759403110.3783] device (eth1): Activation: successful, device activated.
Oct  2 07:05:20 np0005466013 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:05:25 np0005466013 systemd[1]: session-3.scope: Deactivated successfully.
Oct  2 07:05:25 np0005466013 systemd[1]: session-3.scope: Consumed 1.841s CPU time.
Oct  2 07:05:25 np0005466013 systemd-logind[784]: Session 3 logged out. Waiting for processes to exit.
Oct  2 07:05:25 np0005466013 systemd-logind[784]: Removed session 3.
Oct  2 07:05:35 np0005466013 systemd-logind[784]: New session 4 of user zuul.
Oct  2 07:05:35 np0005466013 systemd[1]: Started Session 4 of User zuul.
Oct  2 07:05:35 np0005466013 python3[4178]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:05:35 np0005466013 python3[4251]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759403135.1852455-365-183274594300008/source _original_basename=tmp6ex_ckoc follow=False checksum=7ead7cbef44571b5903e56d225b6c0c65e6bdcb6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:05:38 np0005466013 systemd[1]: session-4.scope: Deactivated successfully.
Oct  2 07:05:38 np0005466013 systemd-logind[784]: Session 4 logged out. Waiting for processes to exit.
Oct  2 07:05:38 np0005466013 systemd-logind[784]: Removed session 4.
Oct  2 07:07:29 np0005466013 systemd[1094]: Created slice User Background Tasks Slice.
Oct  2 07:07:29 np0005466013 systemd[1094]: Starting Cleanup of User's Temporary Files and Directories...
Oct  2 07:07:29 np0005466013 systemd[1094]: Finished Cleanup of User's Temporary Files and Directories.
Oct  2 07:11:24 np0005466013 systemd-logind[784]: New session 5 of user zuul.
Oct  2 07:11:24 np0005466013 systemd[1]: Started Session 5 of User zuul.
Oct  2 07:11:24 np0005466013 python3[4313]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-67bc-3be9-000000000ca4-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:25 np0005466013 python3[4342]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:25 np0005466013 python3[4368]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:25 np0005466013 python3[4394]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:25 np0005466013 python3[4420]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:26 np0005466013 python3[4446]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:26 np0005466013 python3[4446]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct  2 07:11:27 np0005466013 python3[4472]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:11:27 np0005466013 systemd[1]: Reloading.
Oct  2 07:11:27 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:11:29 np0005466013 python3[4528]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct  2 07:11:29 np0005466013 python3[4554]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:30 np0005466013 python3[4582]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:30 np0005466013 python3[4610]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:30 np0005466013 python3[4638]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:31 np0005466013 python3[4665]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-67bc-3be9-000000000caa-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:31 np0005466013 python3[4695]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:11:34 np0005466013 systemd-logind[784]: Session 5 logged out. Waiting for processes to exit.
Oct  2 07:11:34 np0005466013 systemd[1]: session-5.scope: Deactivated successfully.
Oct  2 07:11:34 np0005466013 systemd[1]: session-5.scope: Consumed 3.556s CPU time.
Oct  2 07:11:34 np0005466013 systemd-logind[784]: Removed session 5.
Oct  2 07:11:36 np0005466013 systemd-logind[784]: New session 6 of user zuul.
Oct  2 07:11:36 np0005466013 systemd[1]: Started Session 6 of User zuul.
Oct  2 07:11:36 np0005466013 python3[4731]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  2 07:11:45 np0005466013 irqbalance[782]: Cannot change IRQ 27 affinity: Operation not permitted
Oct  2 07:11:45 np0005466013 irqbalance[782]: IRQ 27 affinity is now unmanaged
Oct  2 07:11:55 np0005466013 kernel: SELinux:  Converting 366 SID table entries...
Oct  2 07:11:55 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:11:55 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:11:55 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:11:55 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:11:55 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:11:55 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:11:55 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:12:04 np0005466013 kernel: SELinux:  Converting 366 SID table entries...
Oct  2 07:12:04 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:12:04 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:12:04 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:12:04 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:12:04 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:12:04 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:12:04 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:12:17 np0005466013 kernel: SELinux:  Converting 366 SID table entries...
Oct  2 07:12:17 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:12:17 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:12:17 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:12:17 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:12:17 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:12:17 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:12:17 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:12:20 np0005466013 setsebool[4791]: The virt_use_nfs policy boolean was changed to 1 by root
Oct  2 07:12:20 np0005466013 setsebool[4791]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct  2 07:12:36 np0005466013 kernel: SELinux:  Converting 369 SID table entries...
Oct  2 07:12:36 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:12:36 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:12:36 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:12:36 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:12:36 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:12:36 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:12:36 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:13:02 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  2 07:13:02 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:13:02 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:13:02 np0005466013 systemd[1]: Reloading.
Oct  2 07:13:03 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:13:03 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:13:10 np0005466013 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:13:10 np0005466013 systemd[1]: Starting Authorization Manager...
Oct  2 07:13:10 np0005466013 polkitd[8433]: Started polkitd version 0.117
Oct  2 07:13:10 np0005466013 systemd[1]: Started Authorization Manager.
Oct  2 07:13:10 np0005466013 systemd[1]: Started PackageKit Daemon.
Oct  2 07:13:14 np0005466013 python3[9531]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-ae2a-8cff-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:13:16 np0005466013 kernel: evm: overlay not supported
Oct  2 07:13:17 np0005466013 systemd[1094]: Starting D-Bus User Message Bus...
Oct  2 07:13:17 np0005466013 dbus-broker-launch[10547]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  2 07:13:17 np0005466013 dbus-broker-launch[10547]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  2 07:13:17 np0005466013 systemd[1094]: Started D-Bus User Message Bus.
Oct  2 07:13:17 np0005466013 dbus-broker-lau[10547]: Ready
Oct  2 07:13:17 np0005466013 systemd[1094]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  2 07:13:17 np0005466013 systemd[1094]: Created slice Slice /user.
Oct  2 07:13:17 np0005466013 systemd[1094]: podman-10274.scope: unit configures an IP firewall, but not running as root.
Oct  2 07:13:17 np0005466013 systemd[1094]: (This warning is only shown for the first unit using IP firewalling.)
Oct  2 07:13:17 np0005466013 systemd[1094]: Started podman-10274.scope.
Oct  2 07:13:17 np0005466013 systemd[1094]: Started podman-pause-f8aa53ce.scope.
Oct  2 07:13:18 np0005466013 python3[10686]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.80:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.80:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:13:18 np0005466013 systemd[1]: session-6.scope: Deactivated successfully.
Oct  2 07:13:18 np0005466013 systemd[1]: session-6.scope: Consumed 1min 7.845s CPU time.
Oct  2 07:13:18 np0005466013 systemd-logind[784]: Session 6 logged out. Waiting for processes to exit.
Oct  2 07:13:18 np0005466013 systemd-logind[784]: Removed session 6.
Oct  2 07:13:50 np0005466013 systemd-logind[784]: New session 7 of user zuul.
Oct  2 07:13:50 np0005466013 systemd[1]: Started Session 7 of User zuul.
Oct  2 07:13:51 np0005466013 python3[19209]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5wXEu1JOQA5KJoTkupC8GEbQNIbg6S2Q6Mp50kFLAjQIUiHO0Vf9azsWL1hcnqZwbQOjTwG/mdjPHjLP6jQ28= zuul@np0005466010.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:13:51 np0005466013 python3[19352]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5wXEu1JOQA5KJoTkupC8GEbQNIbg6S2Q6Mp50kFLAjQIUiHO0Vf9azsWL1hcnqZwbQOjTwG/mdjPHjLP6jQ28= zuul@np0005466010.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:13:52 np0005466013 python3[19685]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005466013.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct  2 07:13:53 np0005466013 python3[19977]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5wXEu1JOQA5KJoTkupC8GEbQNIbg6S2Q6Mp50kFLAjQIUiHO0Vf9azsWL1hcnqZwbQOjTwG/mdjPHjLP6jQ28= zuul@np0005466010.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:13:53 np0005466013 python3[20202]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:13:54 np0005466013 python3[20423]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759403633.554306-170-12331180129357/source _original_basename=tmpx8jswfm4 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:13:55 np0005466013 python3[20697]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Oct  2 07:13:55 np0005466013 systemd[1]: Starting Hostname Service...
Oct  2 07:13:55 np0005466013 systemd[1]: Started Hostname Service.
Oct  2 07:13:55 np0005466013 systemd-hostnamed[20790]: Changed pretty hostname to 'compute-2'
Oct  2 07:13:55 np0005466013 systemd-hostnamed[20790]: Hostname set to <compute-2> (static)
Oct  2 07:13:55 np0005466013 NetworkManager[3984]: <info>  [1759403635.4083] hostname: static hostname changed from "np0005466013.novalocal" to "compute-2"
Oct  2 07:13:55 np0005466013 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:13:55 np0005466013 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:13:56 np0005466013 systemd[1]: session-7.scope: Deactivated successfully.
Oct  2 07:13:56 np0005466013 systemd[1]: session-7.scope: Consumed 2.433s CPU time.
Oct  2 07:13:56 np0005466013 systemd-logind[784]: Session 7 logged out. Waiting for processes to exit.
Oct  2 07:13:56 np0005466013 systemd-logind[784]: Removed session 7.
Oct  2 07:14:05 np0005466013 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:14:21 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:14:21 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:14:21 np0005466013 systemd[1]: man-db-cache-update.service: Consumed 1min 6.311s CPU time.
Oct  2 07:14:21 np0005466013 systemd[1]: run-rc6bb6544334f4ca88be94e4331f83d9d.service: Deactivated successfully.
Oct  2 07:14:25 np0005466013 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:18:16 np0005466013 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:18:19 np0005466013 systemd-logind[784]: New session 8 of user zuul.
Oct  2 07:18:19 np0005466013 systemd[1]: Started Session 8 of User zuul.
Oct  2 07:18:20 np0005466013 python3[26698]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:18:21 np0005466013 python3[26814]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:22 np0005466013 python3[26887]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6498508-30696-234674108180746/source mode=0755 _original_basename=delorean.repo follow=False checksum=bb4c2ff9dad546f135d54d9729ea11b84117755d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:22 np0005466013 python3[26913]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:23 np0005466013 python3[26986]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6498508-30696-234674108180746/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:23 np0005466013 python3[27012]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:23 np0005466013 python3[27085]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6498508-30696-234674108180746/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:23 np0005466013 python3[27111]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:24 np0005466013 python3[27184]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6498508-30696-234674108180746/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:24 np0005466013 python3[27210]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:24 np0005466013 python3[27283]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6498508-30696-234674108180746/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:25 np0005466013 python3[27309]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:25 np0005466013 python3[27382]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6498508-30696-234674108180746/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:25 np0005466013 python3[27408]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:26 np0005466013 python3[27481]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6498508-30696-234674108180746/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=d911291791b114a72daf18f370e91cb1ae300933 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:35 np0005466013 python3[27529]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:23:35 np0005466013 systemd-logind[784]: Session 8 logged out. Waiting for processes to exit.
Oct  2 07:23:35 np0005466013 systemd[1]: session-8.scope: Deactivated successfully.
Oct  2 07:23:35 np0005466013 systemd[1]: session-8.scope: Consumed 5.038s CPU time.
Oct  2 07:23:35 np0005466013 systemd-logind[784]: Removed session 8.
Oct  2 07:32:51 np0005466013 systemd-logind[784]: New session 9 of user zuul.
Oct  2 07:32:51 np0005466013 systemd[1]: Started Session 9 of User zuul.
Oct  2 07:32:53 np0005466013 python3.9[27697]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:32:54 np0005466013 python3.9[27878]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:33:03 np0005466013 systemd[1]: session-9.scope: Deactivated successfully.
Oct  2 07:33:03 np0005466013 systemd[1]: session-9.scope: Consumed 8.634s CPU time.
Oct  2 07:33:03 np0005466013 systemd-logind[784]: Session 9 logged out. Waiting for processes to exit.
Oct  2 07:33:03 np0005466013 systemd-logind[784]: Removed session 9.
Oct  2 07:33:19 np0005466013 systemd-logind[784]: New session 10 of user zuul.
Oct  2 07:33:19 np0005466013 systemd[1]: Started Session 10 of User zuul.
Oct  2 07:33:20 np0005466013 python3.9[28089]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  2 07:33:21 np0005466013 python3.9[28263]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:22 np0005466013 python3.9[28415]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:33:23 np0005466013 python3.9[28568]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:33:24 np0005466013 python3.9[28720]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:33:25 np0005466013 python3.9[28872]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:33:26 np0005466013 python3.9[28995]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404804.7958553-184-190363947915191/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:33:26 np0005466013 python3.9[29147]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:27 np0005466013 python3.9[29303]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:33:28 np0005466013 python3.9[29453]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:33:33 np0005466013 python3.9[29708]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:33:34 np0005466013 python3.9[29858]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:35 np0005466013 python3.9[30012]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:37 np0005466013 python3.9[30170]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:33:37 np0005466013 python3.9[30254]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:34:27 np0005466013 systemd[1]: Reloading.
Oct  2 07:34:27 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:27 np0005466013 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct  2 07:34:27 np0005466013 systemd[1]: Reloading.
Oct  2 07:34:27 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:27 np0005466013 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  2 07:34:27 np0005466013 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  2 07:34:27 np0005466013 systemd[1]: Reloading.
Oct  2 07:34:28 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:28 np0005466013 systemd[1]: Listening on LVM2 poll daemon socket.
Oct  2 07:34:28 np0005466013 dbus-broker-launch[741]: Noticed file-system modification, trigger reload.
Oct  2 07:34:28 np0005466013 dbus-broker-launch[741]: Noticed file-system modification, trigger reload.
Oct  2 07:34:28 np0005466013 dbus-broker-launch[741]: Noticed file-system modification, trigger reload.
Oct  2 07:35:38 np0005466013 kernel: SELinux:  Converting 2714 SID table entries...
Oct  2 07:35:38 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:35:38 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:35:38 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:35:38 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:35:38 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:35:38 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:35:38 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:35:38 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct  2 07:35:38 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:35:38 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:35:38 np0005466013 systemd[1]: Reloading.
Oct  2 07:35:38 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:35:38 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:35:39 np0005466013 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:35:39 np0005466013 systemd[1]: Started PackageKit Daemon.
Oct  2 07:35:39 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:35:39 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:35:39 np0005466013 systemd[1]: man-db-cache-update.service: Consumed 1.209s CPU time.
Oct  2 07:35:39 np0005466013 systemd[1]: run-r07a53f8842124b05aaf8bcdd5708caf3.service: Deactivated successfully.
Oct  2 07:35:40 np0005466013 python3.9[31777]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:35:43 np0005466013 python3.9[32058]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  2 07:35:44 np0005466013 python3.9[32210]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  2 07:35:46 np0005466013 python3.9[32363]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:35:48 np0005466013 python3.9[32515]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  2 07:35:57 np0005466013 python3.9[32668]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:35:57 np0005466013 python3.9[32820]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:35:58 np0005466013 python3.9[32943]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404957.2993715-647-93386583206276/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:36:00 np0005466013 python3.9[33095]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  2 07:36:01 np0005466013 python3.9[33248]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:36:01 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:36:02 np0005466013 python3.9[33407]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:36:03 np0005466013 python3.9[33567]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  2 07:36:04 np0005466013 python3.9[33720]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:36:05 np0005466013 python3.9[33878]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  2 07:36:06 np0005466013 python3.9[34030]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:36:08 np0005466013 python3.9[34184]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:09 np0005466013 python3.9[34336]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:09 np0005466013 python3.9[34459]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404968.6492784-932-85120362972626/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:10 np0005466013 python3.9[34611]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:36:10 np0005466013 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:36:10 np0005466013 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  2 07:36:10 np0005466013 kernel: Bridge firewalling registered
Oct  2 07:36:10 np0005466013 systemd-modules-load[34615]: Inserted module 'br_netfilter'
Oct  2 07:36:10 np0005466013 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:36:11 np0005466013 python3.9[34771]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:12 np0005466013 python3.9[34894]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404971.2526572-1000-39070687717957/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:13 np0005466013 python3.9[35046]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:36:16 np0005466013 dbus-broker-launch[741]: Noticed file-system modification, trigger reload.
Oct  2 07:36:17 np0005466013 dbus-broker-launch[741]: Noticed file-system modification, trigger reload.
Oct  2 07:36:17 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:36:17 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:36:17 np0005466013 systemd[1]: Reloading.
Oct  2 07:36:17 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:36:17 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:36:19 np0005466013 python3.9[36541]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:36:19 np0005466013 python3.9[37397]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  2 07:36:20 np0005466013 python3.9[38143]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:36:21 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:36:21 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:36:21 np0005466013 systemd[1]: man-db-cache-update.service: Consumed 4.818s CPU time.
Oct  2 07:36:21 np0005466013 systemd[1]: run-rca8e4a1e95e94678803c57bbe6916c4b.service: Deactivated successfully.
Oct  2 07:36:21 np0005466013 python3.9[39234]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:21 np0005466013 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  2 07:36:21 np0005466013 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  2 07:36:22 np0005466013 python3.9[39620]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:36:22 np0005466013 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  2 07:36:22 np0005466013 systemd[1]: tuned.service: Deactivated successfully.
Oct  2 07:36:22 np0005466013 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  2 07:36:22 np0005466013 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  2 07:36:23 np0005466013 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  2 07:36:23 np0005466013 python3.9[39781]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  2 07:36:27 np0005466013 python3.9[39933]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:36:27 np0005466013 systemd[1]: Reloading.
Oct  2 07:36:27 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:36:28 np0005466013 python3.9[40122]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:36:28 np0005466013 systemd[1]: Reloading.
Oct  2 07:36:28 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:36:29 np0005466013 python3.9[40311]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:30 np0005466013 python3.9[40464]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:30 np0005466013 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  2 07:36:30 np0005466013 python3.9[40617]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:33 np0005466013 python3.9[40779]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:34 np0005466013 python3.9[40932]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:36:34 np0005466013 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  2 07:36:34 np0005466013 systemd[1]: Stopped Apply Kernel Variables.
Oct  2 07:36:34 np0005466013 systemd[1]: Stopping Apply Kernel Variables...
Oct  2 07:36:34 np0005466013 systemd[1]: Starting Apply Kernel Variables...
Oct  2 07:36:34 np0005466013 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  2 07:36:34 np0005466013 systemd[1]: Finished Apply Kernel Variables.
Oct  2 07:36:34 np0005466013 systemd[1]: session-10.scope: Deactivated successfully.
Oct  2 07:36:34 np0005466013 systemd[1]: session-10.scope: Consumed 2min 21.183s CPU time.
Oct  2 07:36:34 np0005466013 systemd-logind[784]: Session 10 logged out. Waiting for processes to exit.
Oct  2 07:36:34 np0005466013 systemd-logind[784]: Removed session 10.
Oct  2 07:36:40 np0005466013 systemd-logind[784]: New session 11 of user zuul.
Oct  2 07:36:40 np0005466013 systemd[1]: Started Session 11 of User zuul.
Oct  2 07:36:41 np0005466013 python3.9[41116]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:42 np0005466013 python3.9[41270]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:44 np0005466013 python3.9[41426]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:45 np0005466013 python3.9[41577]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:46 np0005466013 python3.9[41733]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:36:47 np0005466013 python3.9[41817]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:36:49 np0005466013 python3.9[41970]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:36:50 np0005466013 python3.9[42141]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:36:51 np0005466013 python3.9[42293]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay-compat2493605903-merged.mount: Deactivated successfully.
Oct  2 07:36:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck4191712469-merged.mount: Deactivated successfully.
Oct  2 07:36:51 np0005466013 podman[42294]: 2025-10-02 11:36:51.088818912 +0000 UTC m=+0.060224233 system refresh
Oct  2 07:36:51 np0005466013 python3.9[42456]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:52 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:36:52 np0005466013 python3.9[42579]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405011.359185-294-269837030832567/.source.json follow=False _original_basename=podman_network_config.j2 checksum=beaca6ab39f706fcfb7bbfdb9e5775ad1ae9e814 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:36:53 np0005466013 python3.9[42731]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:53 np0005466013 python3.9[42854]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405012.9157565-340-77128067531876/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a4fd3ca7d18166099562a65af8d6da655db34efc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:54 np0005466013 python3.9[43006]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:55 np0005466013 python3.9[43158]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:56 np0005466013 python3.9[43310]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:56 np0005466013 python3.9[43463]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:57 np0005466013 python3.9[43613]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:58 np0005466013 python3.9[43767]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:00 np0005466013 python3.9[43920]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:03 np0005466013 python3.9[44080]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:05 np0005466013 python3.9[44235]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:07 np0005466013 python3.9[44388]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:10 np0005466013 python3.9[44544]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:14 np0005466013 python3.9[44712]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:16 np0005466013 python3.9[44865]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:29 np0005466013 python3.9[45202]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:37:30 np0005466013 python3.9[45377]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:37:30 np0005466013 python3.9[45500]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759405049.804664-756-149517887987177/.source.json _original_basename=.isdargdf follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:37:31 np0005466013 python3.9[45652]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:37:31 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:34 np0005466013 systemd[1]: var-lib-containers-storage-overlay-compat3499477041-lower\x2dmapped.mount: Deactivated successfully.
Oct  2 07:37:38 np0005466013 podman[45665]: 2025-10-02 11:37:38.20708276 +0000 UTC m=+6.245867459 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:37:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:39 np0005466013 python3.9[45964]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:37:39 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:41 np0005466013 podman[45976]: 2025-10-02 11:37:41.979294811 +0000 UTC m=+2.180670504 image pull ae232aa720979600656d94fc26ba957f1cdf5bca825fe9b57990f60c6534611f quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:37:41 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:42 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:42 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:43 np0005466013 python3.9[46231]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:37:43 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:44 np0005466013 podman[46244]: 2025-10-02 11:37:44.106309988 +0000 UTC m=+0.949037263 image pull d8d739f82a6fecf9df690e49539b589e74665b54e36448657b874630717d5bd1 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:37:44 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:44 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:44 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:45 np0005466013 python3.9[46476]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:37:45 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:55 np0005466013 podman[46488]: 2025-10-02 11:37:55.872641768 +0000 UTC m=+10.642359844 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:37:55 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:55 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:55 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:02 np0005466013 python3.9[46765]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:38:03 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:06 np0005466013 podman[46777]: 2025-10-02 11:38:06.377546085 +0000 UTC m=+3.328858898 image pull 5f0622bc7c13827171d93b3baf72157e23d24d44579ad79fe3a89ad88180a4bb quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  2 07:38:06 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:06 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:06 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:07 np0005466013 python3.9[47036]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:38:08 np0005466013 podman[47048]: 2025-10-02 11:38:08.524380222 +0000 UTC m=+1.298116339 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  2 07:38:08 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:08 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:08 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:09 np0005466013 systemd[1]: session-11.scope: Deactivated successfully.
Oct  2 07:38:09 np0005466013 systemd[1]: session-11.scope: Consumed 1min 28.850s CPU time.
Oct  2 07:38:09 np0005466013 systemd-logind[784]: Session 11 logged out. Waiting for processes to exit.
Oct  2 07:38:09 np0005466013 systemd-logind[784]: Removed session 11.
Oct  2 07:38:14 np0005466013 systemd-logind[784]: New session 12 of user zuul.
Oct  2 07:38:14 np0005466013 systemd[1]: Started Session 12 of User zuul.
Oct  2 07:38:15 np0005466013 python3.9[47352]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:17 np0005466013 python3.9[47508]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  2 07:38:18 np0005466013 python3.9[47661]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:38:19 np0005466013 python3.9[47819]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:38:20 np0005466013 python3.9[47979]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:38:21 np0005466013 python3.9[48063]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:38:24 np0005466013 python3.9[48224]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:37 np0005466013 kernel: SELinux:  Converting 2725 SID table entries...
Oct  2 07:38:37 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:38:37 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:38:37 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:38:37 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:38:37 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:38:37 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:38:37 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:38:37 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct  2 07:38:37 np0005466013 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  2 07:38:39 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:38:39 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:38:39 np0005466013 systemd[1]: Reloading.
Oct  2 07:38:39 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:38:39 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:38:39 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:38:42 np0005466013 python3.9[49325]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:38:42 np0005466013 systemd[1]: Reloading.
Oct  2 07:38:42 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:38:42 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:38:43 np0005466013 systemd[1]: Starting Open vSwitch Database Unit...
Oct  2 07:38:43 np0005466013 chown[49367]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  2 07:38:43 np0005466013 ovs-ctl[49372]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct  2 07:38:43 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:38:43 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:38:43 np0005466013 systemd[1]: run-r068008275814489f80a0d1435f88c272.service: Deactivated successfully.
Oct  2 07:38:43 np0005466013 ovs-ctl[49372]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct  2 07:38:43 np0005466013 ovs-ctl[49372]: Starting ovsdb-server [  OK  ]
Oct  2 07:38:43 np0005466013 ovs-vsctl[49422]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  2 07:38:43 np0005466013 ovs-vsctl[49439]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"1fc220e5-4479-4f53-8f4d-9aefe7dad458\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  2 07:38:43 np0005466013 ovs-ctl[49372]: Configuring Open vSwitch system IDs [  OK  ]
Oct  2 07:38:43 np0005466013 ovs-ctl[49372]: Enabling remote OVSDB managers [  OK  ]
Oct  2 07:38:43 np0005466013 ovs-vsctl[49448]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct  2 07:38:43 np0005466013 systemd[1]: Started Open vSwitch Database Unit.
Oct  2 07:38:43 np0005466013 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  2 07:38:43 np0005466013 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  2 07:38:43 np0005466013 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  2 07:38:43 np0005466013 kernel: openvswitch: Open vSwitch switching datapath
Oct  2 07:38:43 np0005466013 ovs-ctl[49492]: Inserting openvswitch module [  OK  ]
Oct  2 07:38:43 np0005466013 ovs-ctl[49461]: Starting ovs-vswitchd [  OK  ]
Oct  2 07:38:43 np0005466013 ovs-vsctl[49509]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct  2 07:38:43 np0005466013 ovs-ctl[49461]: Enabling remote OVSDB managers [  OK  ]
Oct  2 07:38:43 np0005466013 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  2 07:38:43 np0005466013 systemd[1]: Starting Open vSwitch...
Oct  2 07:38:43 np0005466013 systemd[1]: Finished Open vSwitch.
Oct  2 07:38:44 np0005466013 python3.9[49661]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:45 np0005466013 python3.9[49813]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  2 07:38:46 np0005466013 kernel: SELinux:  Converting 2739 SID table entries...
Oct  2 07:38:46 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:38:46 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:38:46 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:38:46 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:38:46 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:38:46 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:38:46 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:38:48 np0005466013 python3.9[49969]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:48 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct  2 07:38:49 np0005466013 python3.9[50127]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:51 np0005466013 python3.9[50280]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:38:52 np0005466013 python3.9[50567]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:38:53 np0005466013 python3.9[50717]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:38:54 np0005466013 python3.9[50871]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:55 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:38:55 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:38:55 np0005466013 systemd[1]: Reloading.
Oct  2 07:38:56 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:38:56 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:38:56 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:38:57 np0005466013 python3.9[51187]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:38:57 np0005466013 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  2 07:38:57 np0005466013 systemd[1]: Stopped Network Manager Wait Online.
Oct  2 07:38:57 np0005466013 systemd[1]: Stopping Network Manager Wait Online...
Oct  2 07:38:57 np0005466013 systemd[1]: Stopping Network Manager...
Oct  2 07:38:57 np0005466013 NetworkManager[3984]: <info>  [1759405137.8288] caught SIGTERM, shutting down normally.
Oct  2 07:38:57 np0005466013 NetworkManager[3984]: <info>  [1759405137.8309] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:38:57 np0005466013 NetworkManager[3984]: <info>  [1759405137.8309] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:38:57 np0005466013 NetworkManager[3984]: <info>  [1759405137.8310] dhcp4 (eth0): state changed no lease
Oct  2 07:38:57 np0005466013 NetworkManager[3984]: <info>  [1759405137.8312] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:38:57 np0005466013 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:38:57 np0005466013 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:38:57 np0005466013 NetworkManager[3984]: <info>  [1759405137.9272] exiting (success)
Oct  2 07:38:57 np0005466013 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  2 07:38:57 np0005466013 systemd[1]: Stopped Network Manager.
Oct  2 07:38:57 np0005466013 systemd[1]: NetworkManager.service: Consumed 14.334s CPU time, 4.3M memory peak, read 0B from disk, written 21.0K to disk.
Oct  2 07:38:57 np0005466013 systemd[1]: Starting Network Manager...
Oct  2 07:38:57 np0005466013 NetworkManager[51205]: <info>  [1759405137.9853] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:e608a69a-f987-4b3e-a0e0-e678d85d8e75)
Oct  2 07:38:57 np0005466013 NetworkManager[51205]: <info>  [1759405137.9855] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 07:38:57 np0005466013 NetworkManager[51205]: <info>  [1759405137.9914] manager[0x55d7a3bf9090]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 07:38:58 np0005466013 systemd[1]: Starting Hostname Service...
Oct  2 07:38:58 np0005466013 systemd[1]: Started Hostname Service.
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0705] hostname: hostname: using hostnamed
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0706] hostname: static hostname changed from (none) to "compute-2"
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0710] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0714] manager[0x55d7a3bf9090]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0715] manager[0x55d7a3bf9090]: rfkill: WWAN hardware radio set enabled
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0737] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0746] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0746] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0747] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0747] manager: Networking is enabled by state file
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0752] settings: Loaded settings plugin: keyfile (internal)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0755] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0777] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0787] dhcp: init: Using DHCP client 'internal'
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0792] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0797] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0803] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0810] device (lo): Activation: starting connection 'lo' (e6177a1f-26b3-4119-8eb3-690387d13626)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0815] device (eth0): carrier: link connected
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0819] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0823] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0824] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0829] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0835] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0840] device (eth1): carrier: link connected
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0844] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0849] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (56c504e7-1263-5e10-8bcd-ab3a397bb040) (indicated)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0849] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0854] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0861] device (eth1): Activation: starting connection 'ci-private-network' (56c504e7-1263-5e10-8bcd-ab3a397bb040)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0868] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 07:38:58 np0005466013 systemd[1]: Started Network Manager.
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0876] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0878] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0889] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0898] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0901] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0905] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0907] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0913] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0919] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0922] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0936] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0948] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0954] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0957] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0962] device (lo): Activation: successful, device activated.
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0967] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.0973] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 07:38:58 np0005466013 systemd[1]: Starting Network Manager Wait Online...
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.1894] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.1907] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.1911] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.1914] manager: NetworkManager state is now CONNECTED_LOCAL
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.1917] device (eth1): Activation: successful, device activated.
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.2411] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.2413] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.2416] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.2419] device (eth0): Activation: successful, device activated.
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.2423] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 07:38:58 np0005466013 NetworkManager[51205]: <info>  [1759405138.2944] manager: startup complete
Oct  2 07:38:58 np0005466013 systemd[1]: Finished Network Manager Wait Online.
Oct  2 07:38:58 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:38:58 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:38:58 np0005466013 systemd[1]: run-r5211c15a5fe549039950ff1f69465ba6.service: Deactivated successfully.
Oct  2 07:38:58 np0005466013 python3.9[51414]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:39:06 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:39:06 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:39:06 np0005466013 systemd[1]: Reloading.
Oct  2 07:39:06 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:39:06 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:39:06 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:39:07 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:39:07 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:39:07 np0005466013 systemd[1]: run-rf6b71a9a178940c780e37d3d3860f928.service: Deactivated successfully.
Oct  2 07:39:08 np0005466013 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:39:08 np0005466013 python3.9[51877]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:39:09 np0005466013 python3.9[52029]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:10 np0005466013 python3.9[52183]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:11 np0005466013 python3.9[52335]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:11 np0005466013 python3.9[52487]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:12 np0005466013 python3.9[52639]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:12 np0005466013 python3.9[52791]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:13 np0005466013 python3.9[52914]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405152.4769769-653-35073243207778/.source _original_basename=.kwrvy95x follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:14 np0005466013 python3.9[53066]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:14 np0005466013 python3.9[53218]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct  2 07:39:15 np0005466013 python3.9[53370]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:17 np0005466013 python3.9[53797]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct  2 07:39:18 np0005466013 ansible-async_wrapper.py[53972]: Invoked with j320485700469 300 /home/zuul/.ansible/tmp/ansible-tmp-1759405158.1402004-851-45455681693165/AnsiballZ_edpm_os_net_config.py _
Oct  2 07:39:18 np0005466013 ansible-async_wrapper.py[53975]: Starting module and watcher
Oct  2 07:39:18 np0005466013 ansible-async_wrapper.py[53975]: Start watching 53976 (300)
Oct  2 07:39:18 np0005466013 ansible-async_wrapper.py[53976]: Start module (53976)
Oct  2 07:39:19 np0005466013 ansible-async_wrapper.py[53972]: Return async_wrapper task started.
Oct  2 07:39:19 np0005466013 python3.9[53977]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct  2 07:39:19 np0005466013 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  2 07:39:19 np0005466013 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  2 07:39:19 np0005466013 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  2 07:39:19 np0005466013 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  2 07:39:19 np0005466013 kernel: cfg80211: failed to load regulatory.db
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8270] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8290] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8837] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8839] audit: op="connection-add" uuid="90721afa-1abe-4a26-b436-d2a4c83a5953" name="br-ex-br" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8854] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8855] audit: op="connection-add" uuid="bf7847b5-76e1-4718-a708-ae7b412f12b3" name="br-ex-port" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8867] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8868] audit: op="connection-add" uuid="59c5b0b7-3435-45eb-8fcb-77a74c50d4ea" name="eth1-port" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8879] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8881] audit: op="connection-add" uuid="e8e34eb5-3e13-4e9c-9275-8870d787503e" name="vlan20-port" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8892] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8893] audit: op="connection-add" uuid="ce7067f6-5199-4012-93ba-dc547753e378" name="vlan21-port" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8903] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8904] audit: op="connection-add" uuid="6e236e0a-dc7f-482e-8b30-4565176163f2" name="vlan22-port" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8922] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8937] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.8938] audit: op="connection-add" uuid="785043b5-82c9-42f7-b642-223d3ee05c56" name="br-ex-if" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9032] audit: op="connection-update" uuid="56c504e7-1263-5e10-8bcd-ab3a397bb040" name="ci-private-network" args="ipv6.routing-rules,ipv6.addresses,ipv6.addr-gen-mode,ipv6.dns,ipv6.method,ipv6.routes,ipv4.addresses,ipv4.routing-rules,ipv4.dns,ipv4.never-default,ipv4.method,ipv4.routes,ovs-external-ids.data,connection.controller,connection.port-type,connection.master,connection.slave-type,connection.timestamp,ovs-interface.type" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9048] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9049] audit: op="connection-add" uuid="f1bed0bd-cca5-4738-987f-a98c67c05083" name="vlan20-if" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9064] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9065] audit: op="connection-add" uuid="90d32ff4-a7a7-42e9-8b2c-37973bd8a37e" name="vlan21-if" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9080] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9081] audit: op="connection-add" uuid="37b1f9e4-4f38-4689-a9a1-bb276104eb5f" name="vlan22-if" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9091] audit: op="connection-delete" uuid="72057882-bf27-3d3f-a38a-a878b09c7d39" name="Wired connection 1" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9103] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9112] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9116] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (90721afa-1abe-4a26-b436-d2a4c83a5953)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9117] audit: op="connection-activate" uuid="90721afa-1abe-4a26-b436-d2a4c83a5953" name="br-ex-br" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9119] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9126] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9131] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (bf7847b5-76e1-4718-a708-ae7b412f12b3)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9132] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9138] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9142] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (59c5b0b7-3435-45eb-8fcb-77a74c50d4ea)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9144] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9151] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9154] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (e8e34eb5-3e13-4e9c-9275-8870d787503e)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9156] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9162] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9166] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ce7067f6-5199-4012-93ba-dc547753e378)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9168] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9174] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9177] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (6e236e0a-dc7f-482e-8b30-4565176163f2)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9179] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9181] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9183] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9189] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9193] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9198] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (785043b5-82c9-42f7-b642-223d3ee05c56)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9199] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9202] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9204] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9205] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9206] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9216] device (eth1): disconnecting for new activation request.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9217] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9220] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9223] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9224] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9227] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9232] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9236] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f1bed0bd-cca5-4738-987f-a98c67c05083)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9248] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9252] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9255] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9256] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9259] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9263] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9267] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (90d32ff4-a7a7-42e9-8b2c-37973bd8a37e)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9268] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9271] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9273] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9273] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9276] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9279] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9283] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (37b1f9e4-4f38-4689-a9a1-bb276104eb5f)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9284] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9287] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9289] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9290] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9291] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9304] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9306] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9308] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9310] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9316] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9319] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9323] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9326] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9328] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 kernel: ovs-system: entered promiscuous mode
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9333] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9337] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9340] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9342] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9347] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9352] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 kernel: Timeout policy base is empty
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9356] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 systemd-udevd[53982]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9358] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9362] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9366] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9366] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9366] dhcp4 (eth0): state changed no lease
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9367] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9379] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9383] audit: op="device-reapply" interface="eth1" ifindex=3 pid=53978 uid=0 result="fail" reason="Device is not activated"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9388] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9429] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9442] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9449] device (eth1): disconnecting for new activation request.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9449] audit: op="connection-activate" uuid="56c504e7-1263-5e10-8bcd-ab3a397bb040" name="ci-private-network" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9465] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Oct  2 07:39:20 np0005466013 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9558] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53978 uid=0 result="success"
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9559] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  2 07:39:20 np0005466013 kernel: br-ex: entered promiscuous mode
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9657] device (eth1): Activation: starting connection 'ci-private-network' (56c504e7-1263-5e10-8bcd-ab3a397bb040)
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9661] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9662] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9664] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9665] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9667] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9669] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9675] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9680] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9685] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9691] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9695] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9702] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9706] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9710] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9714] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9718] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9722] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9727] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9731] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9736] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9759] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9771] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9772] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9777] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 kernel: vlan22: entered promiscuous mode
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9798] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 kernel: vlan21: entered promiscuous mode
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9856] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9858] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 systemd-udevd[53984]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9861] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9869] device (eth1): Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9874] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9879] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9891] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:20 np0005466013 NetworkManager[51205]: <info>  [1759405160.9903] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466013 kernel: vlan20: entered promiscuous mode
Oct  2 07:39:20 np0005466013 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0000] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0005] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0016] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0023] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0045] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0056] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0069] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0127] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0129] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0130] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0136] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0141] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:21 np0005466013 NetworkManager[51205]: <info>  [1759405161.0145] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.1373] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.2910] checkpoint[0x55d7a3bcf950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.2912] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.5305] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.5317] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.7202] audit: op="networking-control" arg="global-dns-configuration" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.7253] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.7280] audit: op="networking-control" arg="global-dns-configuration" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.7306] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 python3.9[54312]: ansible-ansible.legacy.async_status Invoked with jid=j320485700469.53972 mode=status _async_dir=/root/.ansible_async
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.8520] checkpoint[0x55d7a3bcfa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct  2 07:39:22 np0005466013 NetworkManager[51205]: <info>  [1759405162.8523] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53978 uid=0 result="success"
Oct  2 07:39:22 np0005466013 ansible-async_wrapper.py[53976]: Module complete (53976)
Oct  2 07:39:24 np0005466013 ansible-async_wrapper.py[53975]: Done in kid B.
Oct  2 07:39:26 np0005466013 python3.9[54416]: ansible-ansible.legacy.async_status Invoked with jid=j320485700469.53972 mode=status _async_dir=/root/.ansible_async
Oct  2 07:39:26 np0005466013 python3.9[54516]: ansible-ansible.legacy.async_status Invoked with jid=j320485700469.53972 mode=cleanup _async_dir=/root/.ansible_async
Oct  2 07:39:27 np0005466013 python3.9[54668]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:28 np0005466013 python3.9[54791]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405167.0337234-927-15422593842598/.source.returncode _original_basename=.e6_mottd follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:28 np0005466013 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:39:28 np0005466013 python3.9[54945]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:29 np0005466013 python3.9[55068]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405168.2463496-975-171552940902121/.source.cfg _original_basename=.qwucld08 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:30 np0005466013 python3.9[55221]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:39:30 np0005466013 systemd[1]: Reloading Network Manager...
Oct  2 07:39:30 np0005466013 NetworkManager[51205]: <info>  [1759405170.1514] audit: op="reload" arg="0" pid=55225 uid=0 result="success"
Oct  2 07:39:30 np0005466013 NetworkManager[51205]: <info>  [1759405170.1523] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct  2 07:39:30 np0005466013 systemd[1]: Reloaded Network Manager.
Oct  2 07:39:30 np0005466013 systemd[1]: session-12.scope: Deactivated successfully.
Oct  2 07:39:30 np0005466013 systemd[1]: session-12.scope: Consumed 48.112s CPU time.
Oct  2 07:39:30 np0005466013 systemd-logind[784]: Session 12 logged out. Waiting for processes to exit.
Oct  2 07:39:30 np0005466013 systemd-logind[784]: Removed session 12.
Oct  2 07:39:35 np0005466013 systemd-logind[784]: New session 13 of user zuul.
Oct  2 07:39:35 np0005466013 systemd[1]: Started Session 13 of User zuul.
Oct  2 07:39:36 np0005466013 python3.9[55409]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:37 np0005466013 python3.9[55563]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:39:38 np0005466013 python3.9[55752]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:39:39 np0005466013 systemd[1]: session-13.scope: Deactivated successfully.
Oct  2 07:39:39 np0005466013 systemd[1]: session-13.scope: Consumed 2.160s CPU time.
Oct  2 07:39:39 np0005466013 systemd-logind[784]: Session 13 logged out. Waiting for processes to exit.
Oct  2 07:39:39 np0005466013 systemd-logind[784]: Removed session 13.
Oct  2 07:39:40 np0005466013 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:39:44 np0005466013 systemd-logind[784]: New session 14 of user zuul.
Oct  2 07:39:44 np0005466013 systemd[1]: Started Session 14 of User zuul.
Oct  2 07:39:45 np0005466013 python3.9[55934]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:46 np0005466013 python3.9[56089]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:47 np0005466013 python3.9[56245]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:39:48 np0005466013 python3.9[56329]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:39:50 np0005466013 python3.9[56483]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:39:51 np0005466013 python3.9[56674]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:52 np0005466013 python3.9[56827]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:39:52 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:39:53 np0005466013 python3.9[56990]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:53 np0005466013 python3.9[57068]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:54 np0005466013 python3.9[57220]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:54 np0005466013 python3.9[57298]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:55 np0005466013 python3.9[57450]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:55 np0005466013 python3.9[57602]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:56 np0005466013 python3.9[57754]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:57 np0005466013 python3.9[57906]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:57 np0005466013 python3.9[58058]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:40:00 np0005466013 python3.9[58211]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:40:00 np0005466013 python3.9[58365]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:40:01 np0005466013 python3.9[58517]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:40:02 np0005466013 python3.9[58669]: ansible-service_facts Invoked
Oct  2 07:40:02 np0005466013 network[58686]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:40:02 np0005466013 network[58687]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:40:02 np0005466013 network[58688]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:40:08 np0005466013 python3.9[59144]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:40:10 np0005466013 python3.9[59297]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  2 07:40:12 np0005466013 python3.9[59449]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:13 np0005466013 python3.9[59574]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405212.092255-628-177244086084218/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:14 np0005466013 python3.9[59728]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:14 np0005466013 python3.9[59853]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405213.5940468-674-51313989837461/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:16 np0005466013 python3.9[60007]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:17 np0005466013 python3.9[60161]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:40:19 np0005466013 python3.9[60245]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:20 np0005466013 python3.9[60399]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:40:21 np0005466013 python3.9[60483]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:40:21 np0005466013 chronyd[794]: chronyd exiting
Oct  2 07:40:21 np0005466013 systemd[1]: Stopping NTP client/server...
Oct  2 07:40:21 np0005466013 systemd[1]: chronyd.service: Deactivated successfully.
Oct  2 07:40:21 np0005466013 systemd[1]: Stopped NTP client/server.
Oct  2 07:40:21 np0005466013 systemd[1]: Starting NTP client/server...
Oct  2 07:40:21 np0005466013 chronyd[60492]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  2 07:40:21 np0005466013 chronyd[60492]: Frequency 9.193 +/- 0.197 ppm read from /var/lib/chrony/drift
Oct  2 07:40:21 np0005466013 chronyd[60492]: Loaded seccomp filter (level 2)
Oct  2 07:40:21 np0005466013 systemd[1]: Started NTP client/server.
Oct  2 07:40:22 np0005466013 systemd[1]: session-14.scope: Deactivated successfully.
Oct  2 07:40:22 np0005466013 systemd[1]: session-14.scope: Consumed 23.243s CPU time.
Oct  2 07:40:22 np0005466013 systemd-logind[784]: Session 14 logged out. Waiting for processes to exit.
Oct  2 07:40:22 np0005466013 systemd-logind[784]: Removed session 14.
Oct  2 07:40:27 np0005466013 systemd-logind[784]: New session 15 of user zuul.
Oct  2 07:40:27 np0005466013 systemd[1]: Started Session 15 of User zuul.
Oct  2 07:40:28 np0005466013 python3.9[60671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:40:29 np0005466013 python3.9[60827]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:30 np0005466013 python3.9[61002]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:30 np0005466013 python3.9[61080]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.9lzsqprk recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:31 np0005466013 python3.9[61232]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:32 np0005466013 python3.9[61355]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405231.5266414-151-239568847818290/.source _original_basename=.7lcbklk_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:33 np0005466013 python3.9[61507]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:40:34 np0005466013 python3.9[61659]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:34 np0005466013 python3.9[61782]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405233.7379065-223-161367182201571/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:40:35 np0005466013 python3.9[61934]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:35 np0005466013 python3.9[62057]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405234.8797584-223-190418854397322/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:40:36 np0005466013 python3.9[62209]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:37 np0005466013 python3.9[62361]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:38 np0005466013 python3.9[62484]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405237.1481037-333-74509855907860/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:38 np0005466013 python3.9[62636]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:39 np0005466013 python3.9[62759]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405238.4277556-378-128223784252638/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:40 np0005466013 python3.9[62911]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:40 np0005466013 systemd[1]: Reloading.
Oct  2 07:40:40 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:40 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:40 np0005466013 systemd[1]: Reloading.
Oct  2 07:40:40 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:40 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:41 np0005466013 systemd[1]: Starting EDPM Container Shutdown...
Oct  2 07:40:41 np0005466013 systemd[1]: Finished EDPM Container Shutdown.
Oct  2 07:40:41 np0005466013 python3.9[63138]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:42 np0005466013 python3.9[63261]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405241.2524784-448-264801241483382/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:42 np0005466013 python3.9[63413]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:43 np0005466013 python3.9[63536]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405242.5773737-493-38692580033079/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:44 np0005466013 python3.9[63688]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:44 np0005466013 systemd[1]: Reloading.
Oct  2 07:40:44 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:44 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:44 np0005466013 systemd[1]: Reloading.
Oct  2 07:40:44 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:44 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:44 np0005466013 systemd[1]: Starting Create netns directory...
Oct  2 07:40:44 np0005466013 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:40:44 np0005466013 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:40:44 np0005466013 systemd[1]: Finished Create netns directory.
Oct  2 07:40:45 np0005466013 python3.9[63915]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:40:45 np0005466013 network[63932]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:40:45 np0005466013 network[63933]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:40:45 np0005466013 network[63934]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:40:54 np0005466013 python3.9[64198]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:54 np0005466013 systemd[1]: Reloading.
Oct  2 07:40:54 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:54 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:54 np0005466013 systemd[1]: Stopping IPv4 firewall with iptables...
Oct  2 07:40:54 np0005466013 iptables.init[64238]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct  2 07:40:55 np0005466013 iptables.init[64238]: iptables: Flushing firewall rules: [  OK  ]
Oct  2 07:40:55 np0005466013 systemd[1]: iptables.service: Deactivated successfully.
Oct  2 07:40:55 np0005466013 systemd[1]: Stopped IPv4 firewall with iptables.
Oct  2 07:40:55 np0005466013 python3.9[64434]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:56 np0005466013 python3.9[64588]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:56 np0005466013 systemd[1]: Reloading.
Oct  2 07:40:56 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:56 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:56 np0005466013 systemd[1]: Starting Netfilter Tables...
Oct  2 07:40:56 np0005466013 systemd[1]: Finished Netfilter Tables.
Oct  2 07:40:57 np0005466013 python3.9[64780]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:40:58 np0005466013 python3.9[64933]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:59 np0005466013 python3.9[65058]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405258.370883-700-13796294970115/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:00 np0005466013 python3.9[65209]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:41:25 np0005466013 systemd[1]: session-15.scope: Deactivated successfully.
Oct  2 07:41:25 np0005466013 systemd[1]: session-15.scope: Consumed 17.667s CPU time.
Oct  2 07:41:25 np0005466013 systemd-logind[784]: Session 15 logged out. Waiting for processes to exit.
Oct  2 07:41:25 np0005466013 systemd-logind[784]: Removed session 15.
Oct  2 07:41:38 np0005466013 systemd-logind[784]: New session 16 of user zuul.
Oct  2 07:41:38 np0005466013 systemd[1]: Started Session 16 of User zuul.
Oct  2 07:41:39 np0005466013 python3.9[65402]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:41:40 np0005466013 python3.9[65558]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:41 np0005466013 python3.9[65733]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:41 np0005466013 python3.9[65811]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.251tpa77 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:42 np0005466013 python3.9[65963]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:42 np0005466013 python3.9[66041]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.uaurewei recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:43 np0005466013 python3.9[66193]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:44 np0005466013 python3.9[66345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:44 np0005466013 python3.9[66423]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:45 np0005466013 python3.9[66575]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:46 np0005466013 python3.9[66653]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:46 np0005466013 python3.9[66805]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:47 np0005466013 python3.9[66957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:47 np0005466013 python3.9[67035]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:48 np0005466013 python3.9[67187]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:48 np0005466013 python3.9[67265]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:50 np0005466013 python3.9[67417]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:41:50 np0005466013 systemd[1]: Reloading.
Oct  2 07:41:50 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:41:50 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:41:50 np0005466013 python3.9[67606]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:51 np0005466013 python3.9[67684]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:52 np0005466013 python3.9[67836]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:52 np0005466013 python3.9[67914]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:53 np0005466013 python3.9[68066]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:41:53 np0005466013 systemd[1]: Reloading.
Oct  2 07:41:53 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:41:53 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:41:53 np0005466013 systemd[1]: Starting Create netns directory...
Oct  2 07:41:53 np0005466013 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:41:53 np0005466013 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:41:53 np0005466013 systemd[1]: Finished Create netns directory.
Oct  2 07:41:55 np0005466013 python3.9[68259]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:41:55 np0005466013 network[68276]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:41:55 np0005466013 network[68277]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:41:55 np0005466013 network[68278]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:42:00 np0005466013 python3.9[68541]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:00 np0005466013 python3.9[68619]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:01 np0005466013 python3.9[68771]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:02 np0005466013 python3.9[68923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:02 np0005466013 python3.9[69046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405321.835913-615-24530245382258/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:04 np0005466013 python3.9[69198]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  2 07:42:04 np0005466013 systemd[1]: Starting Time & Date Service...
Oct  2 07:42:04 np0005466013 systemd[1]: Started Time & Date Service.
Oct  2 07:42:05 np0005466013 python3.9[69354]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:05 np0005466013 python3.9[69506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:06 np0005466013 python3.9[69629]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405325.2518005-721-138866123473330/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:06 np0005466013 python3.9[69781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:07 np0005466013 python3.9[69904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405326.4859705-766-92071999837667/.source.yaml _original_basename=.osv4jk23 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:08 np0005466013 python3.9[70056]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:08 np0005466013 python3.9[70179]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405327.6240714-811-20018959840519/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:09 np0005466013 python3.9[70331]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:10 np0005466013 python3.9[70484]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:10 np0005466013 python3[70637]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:42:11 np0005466013 python3.9[70789]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:12 np0005466013 python3.9[70912]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405331.1923463-928-50451627452955/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:13 np0005466013 python3.9[71064]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:13 np0005466013 python3.9[71187]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405332.5375772-972-227020183230699/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:14 np0005466013 python3.9[71339]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:14 np0005466013 python3.9[71462]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405333.8327677-1018-8327320181312/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:15 np0005466013 python3.9[71614]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:16 np0005466013 python3.9[71737]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405335.5221834-1063-194600151955523/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:17 np0005466013 python3.9[71889]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:17 np0005466013 python3.9[72012]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405336.7703068-1108-84612640279023/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:18 np0005466013 python3.9[72164]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:19 np0005466013 python3.9[72316]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:20 np0005466013 python3.9[72475]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:21 np0005466013 python3.9[72628]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:21 np0005466013 python3.9[72780]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:23 np0005466013 python3.9[72932]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:42:23 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:42:23 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:42:23 np0005466013 python3.9[73086]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:42:24 np0005466013 systemd[1]: session-16.scope: Deactivated successfully.
Oct  2 07:42:24 np0005466013 systemd[1]: session-16.scope: Consumed 30.324s CPU time.
Oct  2 07:42:24 np0005466013 systemd-logind[784]: Session 16 logged out. Waiting for processes to exit.
Oct  2 07:42:24 np0005466013 systemd-logind[784]: Removed session 16.
Oct  2 07:42:29 np0005466013 systemd-logind[784]: New session 17 of user zuul.
Oct  2 07:42:29 np0005466013 systemd[1]: Started Session 17 of User zuul.
Oct  2 07:42:30 np0005466013 python3.9[73267]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  2 07:42:30 np0005466013 python3.9[73419]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:42:31 np0005466013 chronyd[60492]: Selected source 45.61.49.156 (pool.ntp.org)
Oct  2 07:42:32 np0005466013 python3.9[73571]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:33 np0005466013 python3.9[73723]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCVkUIA0SGLhushMFSLFSAWpWCX1FF5YUjql8/6tMZQcpzUyU7mJOEQY7Jf3ZvoRVMiETNv8NaicCQ10qaPGZQwEamylEkW24WAdEJ+0NDO/DPkUTIp6vmhyqMNK8IeoLM1RrAM82pBxdQ+jut498Pj6OeLzo75U5X+AQp3kNKD6nnt+JeBNs5kT35nF/5InhW1d2N5LWKKnnw2LJIgpPZkpDwuRAOTnEp/nyNR1NyRQY1VpGMuAXgEkvvu1no1xBYM2lnfNEwn46Bcfr8p+n5Jv3gJBcteKnTCaLF0CagpfSTcvar4pcN97zXX4Jlq0VyVjit+YemnX5EnCaQoK6sYtatkGsRooS56wc+WtVHhf155ZIAj8wPRwWpcXZq+EV0SwoTFwUUNTXToz7qscdq04OHTl0bFRFQevmks+w6V4a7CzQa1/eeGlYdGEUS1I0dC5eeHDewjoLwo5+ufxHrbBmxaZrgbtwk1E9MQqj3PmdFlh17a83VHQwat591/QWU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKeqMQGrAV3pXZcV6Ore8xolY214SO0KlbtK5lvj/17F#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM+LLWBT7ZHcUDyX8Xq/MZx34NXsN2QLd9BzdUzQgHmTREhCHesKInMqP8HfljOxzmUfohPV1AQVEYpXvhkaaQM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpSx4Pw7AZfRlGUxa5vBESKqssXVQvrJz1cHMKGIXXt2c6o14yjlDJQkQZMnezLwc2Chr5fJ4DbiWklwKtNLctbQXR5ygqWs0bxMEnUYw+SjtdNwhykNDKkOJF64+ZtokEdpLHge0NvMivE2EBqu3TeXUji1OpHV3NGMiFKFwb0YsujbJuPjzPh6igp8NPD3uwcNrf+rcVQz8qlT/9rxdBMoyNjDoha3HCOOQDoColV7DbtQNdDBy+PMi8DOqzRJ/iPi7C26lVo+1xQL/ZKdmOOijv/QkqsY3ejuzIO9w3z3+GuykWEdEzm3EkUZJ8Q48/OwksBIdmcOC2Ke46PTLmftlRsdK0YUy7UyzGX7HQ++JYiTXyXN92ieFxNY3MmKu/70/67TT90mqVUOkZ9C32ixYMvj2hhnxS5+bmnMjpwCkUvgS1BmmSof6ghFjYZsP6zgTonqOtP5gt5VLjy7xNuApqVGmSN09/ExnZcGBX3ymXsxepc6spJeZ7hw2P4E8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILxXcYZxs19Ipxj4mIzt7SBi+8WzNq9W70+VNtppPYi1#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDZWnRJzd2ypotxaluhYES+V8G+b3/YU1LqQdpTWOWSO1QiTR0RJRiCt3KgKfluISOv8H6sHrJ9PKv84heszJQY=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC30XwCLl481RnJGbLpEu8HK5UD53phC9aWXrs97/vSr9LY5Wlu1FrcSpDZK7wWwcUjs+Ug5XBJCr6avXKE4rjPwk89lQ1q9g/H9bpxdA4xrV5Eoc6riCUU7Ig86tKNwjKxxe5YXXkbQXzO1m31FHYpGh6MsVqR+sdC4B+xoAW7BJ+sTbHJ0l17YcK68hwv9ZNXBecuDjZDvLtDNje8ZGmmlUIAQ9MfLqzQr0EclCOAdN+tu1Se7EQ/8vqrT6CSp6hCSBXg2bK7fPi0mqJ1MgA1xig5gH2fONZWMZ9gDEbfhr3UMzXKiB9YuhIx/xfPq174TvmMwN89+fteCUEl7FYK0+huTyjiBNyHBhniq+ndB0camrvH6y1i0qFjY2JAZ9zt1odn0an1VRX2fLnwHlbLgEzV7kFf7kzLvc38F4Hd2a4K7/W8rJ80hL4T0aYiPZvbt0T6Z8dKMiNdh5Uq6HXxMW3HhGZER30lJh4bTzzwRBwMlgFLe4nxKXKtNZggdHU=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDQRjtGGFpYzrfHwb+9O0hMfMhijlzqGxkH0vMapGQGq#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLy3xOsuqZD05zHjHYtORv2L5Dy5w2gv1l1NTxi4JLb2kboxAJmGY6ewcs/tttddwUtZ4hxQZpPqVyCmq+Pg//I=#012 create=True mode=0644 path=/tmp/ansible.29a2jlw5 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:34 np0005466013 python3.9[73875]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.29a2jlw5' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:34 np0005466013 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 07:42:34 np0005466013 python3.9[74031]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.29a2jlw5 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:35 np0005466013 systemd[1]: session-17.scope: Deactivated successfully.
Oct  2 07:42:35 np0005466013 systemd[1]: session-17.scope: Consumed 3.281s CPU time.
Oct  2 07:42:35 np0005466013 systemd-logind[784]: Session 17 logged out. Waiting for processes to exit.
Oct  2 07:42:35 np0005466013 systemd-logind[784]: Removed session 17.
Oct  2 07:42:42 np0005466013 systemd-logind[784]: New session 18 of user zuul.
Oct  2 07:42:42 np0005466013 systemd[1]: Started Session 18 of User zuul.
Oct  2 07:42:43 np0005466013 python3.9[74209]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:44 np0005466013 python3.9[74365]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  2 07:42:45 np0005466013 python3.9[74519]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:42:46 np0005466013 python3.9[74672]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:47 np0005466013 python3.9[74825]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:42:47 np0005466013 python3.9[74979]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:48 np0005466013 python3.9[75134]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:49 np0005466013 systemd[1]: session-18.scope: Deactivated successfully.
Oct  2 07:42:49 np0005466013 systemd[1]: session-18.scope: Consumed 4.232s CPU time.
Oct  2 07:42:49 np0005466013 systemd-logind[784]: Session 18 logged out. Waiting for processes to exit.
Oct  2 07:42:49 np0005466013 systemd-logind[784]: Removed session 18.
Oct  2 07:42:55 np0005466013 systemd-logind[784]: New session 19 of user zuul.
Oct  2 07:42:55 np0005466013 systemd[1]: Started Session 19 of User zuul.
Oct  2 07:42:56 np0005466013 python3.9[75313]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:57 np0005466013 python3.9[75469]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:42:58 np0005466013 python3.9[75553]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:43:01 np0005466013 python3.9[75704]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:43:02 np0005466013 python3.9[75855]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:43:03 np0005466013 python3.9[76005]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:43:04 np0005466013 python3.9[76155]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:43:04 np0005466013 systemd[1]: session-19.scope: Deactivated successfully.
Oct  2 07:43:04 np0005466013 systemd[1]: session-19.scope: Consumed 5.778s CPU time.
Oct  2 07:43:04 np0005466013 systemd-logind[784]: Session 19 logged out. Waiting for processes to exit.
Oct  2 07:43:04 np0005466013 systemd-logind[784]: Removed session 19.
Oct  2 07:43:11 np0005466013 systemd-logind[784]: New session 20 of user zuul.
Oct  2 07:43:11 np0005466013 systemd[1]: Started Session 20 of User zuul.
Oct  2 07:43:12 np0005466013 python3.9[76335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:14 np0005466013 python3.9[76491]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:15 np0005466013 python3.9[76643]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:16 np0005466013 python3.9[76795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:17 np0005466013 python3.9[76918]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405395.8595192-166-260605620159774/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=8cab53582acd1ad5f20464960729bb0fb5324499 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:17 np0005466013 python3.9[77070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:18 np0005466013 python3.9[77193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405397.5377235-166-95856153429886/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=618966fd8924c3b9caddce17df39815c03c6e5f3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:19 np0005466013 python3.9[77345]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:19 np0005466013 python3.9[77468]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405398.7634976-166-161950454906313/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=e65379068ad06f0ba13585933201a6d36d9f1f1d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:20 np0005466013 python3.9[77620]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:21 np0005466013 python3.9[77772]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:22 np0005466013 python3.9[77924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:22 np0005466013 python3.9[78047]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405401.5616715-339-132150247076519/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=234a7a8588e3a8e82cb5d65c444bd7e6af253d2a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:23 np0005466013 python3.9[78199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:23 np0005466013 python3.9[78322]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405402.7231874-339-261558631062355/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=cef26c6879264807de4e1e28241ed8a223aa26e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:24 np0005466013 python3.9[78474]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:24 np0005466013 python3.9[78597]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405403.86072-339-140907509797270/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=75be0f1c479fd8aa8d482acb1a7da9cd9546c247 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:25 np0005466013 python3.9[78749]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:26 np0005466013 python3.9[78901]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:26 np0005466013 python3.9[79053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:27 np0005466013 python3.9[79176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405406.377826-527-230127786934214/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=afb88d33bd08f76a92c2e002ce714b95ad2aba5e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:27 np0005466013 python3.9[79328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:28 np0005466013 python3.9[79451]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405407.4937022-527-228685283242185/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=b4329abfe8c8dfc3dff902009782a13facac4ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:29 np0005466013 python3.9[79603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:29 np0005466013 python3.9[79726]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405408.6133287-527-123945707657940/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=55efa9e2e14f10469eaffbdd5d8921262802e77e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:30 np0005466013 python3.9[79878]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:30 np0005466013 python3.9[80030]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:31 np0005466013 python3.9[80182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:32 np0005466013 python3.9[80305]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405411.1006489-703-211667088288072/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=751c8d4ce017633338bc5fb50233ef2b2ef14464 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:32 np0005466013 python3.9[80457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:33 np0005466013 python3.9[80580]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405412.369854-703-272636368938997/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=b4329abfe8c8dfc3dff902009782a13facac4ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:33 np0005466013 python3.9[80732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:34 np0005466013 python3.9[80855]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405413.4659328-703-239943843272839/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=7b4ba87dfe939a8498468ac1f28f97794a0cc2b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:35 np0005466013 python3.9[81007]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:36 np0005466013 python3.9[81159]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:36 np0005466013 python3.9[81282]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405415.8555553-909-225679253035838/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:37 np0005466013 python3.9[81434]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:38 np0005466013 python3.9[81586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:38 np0005466013 python3.9[81709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405417.6908612-977-106893259449162/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:39 np0005466013 python3.9[81861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:40 np0005466013 python3.9[82013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:40 np0005466013 python3.9[82136]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405419.500897-1043-68851157402512/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:41 np0005466013 python3.9[82288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:41 np0005466013 python3.9[82440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:42 np0005466013 python3.9[82563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405421.3540244-1105-16388122716488/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:43 np0005466013 python3.9[82715]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:43 np0005466013 python3.9[82867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:44 np0005466013 python3.9[82990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405423.192547-1173-34519888593587/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:44 np0005466013 python3.9[83142]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:45 np0005466013 python3.9[83294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:45 np0005466013 python3.9[83417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405424.9356127-1245-215129692521764/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:46 np0005466013 python3.9[83569]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:47 np0005466013 python3.9[83721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:47 np0005466013 python3.9[83844]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405426.859923-1322-143270025560158/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:49 np0005466013 systemd[1]: session-20.scope: Deactivated successfully.
Oct  2 07:43:49 np0005466013 systemd[1]: session-20.scope: Consumed 27.058s CPU time.
Oct  2 07:43:49 np0005466013 systemd-logind[784]: Session 20 logged out. Waiting for processes to exit.
Oct  2 07:43:49 np0005466013 systemd-logind[784]: Removed session 20.
Oct  2 07:43:54 np0005466013 systemd-logind[784]: New session 21 of user zuul.
Oct  2 07:43:54 np0005466013 systemd[1]: Started Session 21 of User zuul.
Oct  2 07:43:55 np0005466013 python3.9[84022]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:56 np0005466013 python3.9[84178]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:57 np0005466013 python3.9[84330]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:57 np0005466013 python3.9[84480]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:58 np0005466013 python3.9[84632]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  2 07:44:00 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct  2 07:44:00 np0005466013 python3.9[84788]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:44:01 np0005466013 python3.9[84872]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:44:04 np0005466013 python3.9[85025]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:44:05 np0005466013 python3[85180]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct  2 07:44:05 np0005466013 python3.9[85332]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:06 np0005466013 python3.9[85484]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:07 np0005466013 python3.9[85562]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:07 np0005466013 python3.9[85714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:08 np0005466013 python3.9[85792]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rxuiymki recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:08 np0005466013 python3.9[85944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:09 np0005466013 python3.9[86022]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:10 np0005466013 python3.9[86174]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:11 np0005466013 python3[86327]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:44:11 np0005466013 python3.9[86479]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:12 np0005466013 python3.9[86604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405451.4570105-438-246867456483048/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:13 np0005466013 python3.9[86756]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:13 np0005466013 python3.9[86881]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405452.828688-483-216059165656929/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:14 np0005466013 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:44:14 np0005466013 python3.9[87033]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:15 np0005466013 python3.9[87158]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405454.1351619-529-234134005402416/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:15 np0005466013 python3.9[87310]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:16 np0005466013 python3.9[87435]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405455.326021-574-266107565031041/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:17 np0005466013 python3.9[87587]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:17 np0005466013 python3.9[87712]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405456.5264778-618-9932007738091/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:18 np0005466013 python3.9[87864]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:19 np0005466013 python3.9[88016]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:19 np0005466013 python3.9[88171]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
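Decoded (`#012` is the syslog escape for a newline), the `blockinfile` task above maintains the following managed block in `/etc/sysconfig/nftables.conf`, validating the result with `nft -c -f %s` before committing it:

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```

This is what makes the EDPM ruleset persistent across reboots: the system `nftables.service` loads `/etc/sysconfig/nftables.conf`, which in turn includes the generated fragments.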
Oct  2 07:44:20 np0005466013 python3.9[88323]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:21 np0005466013 python3.9[88476]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:22 np0005466013 python3.9[88630]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:22 np0005466013 python3.9[88785]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:24 np0005466013 python3.9[88935]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:44:25 np0005466013 python3.9[89088]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:25 np0005466013 ovs-vsctl[89089]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
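The `ovs-vsctl set open .` call above registers this node as an OVN chassis by writing `external_ids` keys into the local Open vSwitch database. A small sketch of how such an argument vector can be built from a key/value mapping (the helper function is hypothetical; the example values are taken from the logged command):

```python
def ovs_vsctl_set_args(external_ids: dict[str, str]) -> list[str]:
    """Render an `ovs-vsctl set open .` argv from OVN chassis settings."""
    args = ["ovs-vsctl", "set", "open", "."]
    args += [f"external_ids:{key}={value}" for key, value in external_ids.items()]
    return args

# Values observed in the log line above.
chassis_ids = {
    "hostname": "compute-2.ctlplane.example.com",
    "ovn-bridge": "br-int",
    "ovn-bridge-mappings": "datacentre:br-ex",
    "ovn-encap-type": "geneve",
    "ovn-encap-ip": "172.19.0.102",
    "ovn-remote": "ssl:ovsdbserver-sb.openstack.svc:6642",
    "ovn-monitor-all": "True",
}
```

`ovn-remote` points the local ovn-controller at the southbound database over SSL, and `ovn-encap-ip`/`ovn-encap-type` select the tunnel endpoint and encapsulation (Geneve) for east-west traffic.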
Oct  2 07:44:26 np0005466013 python3.9[89241]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:27 np0005466013 python3.9[89396]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:27 np0005466013 ovs-vsctl[89397]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct  2 07:44:27 np0005466013 python3.9[89547]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:28 np0005466013 python3.9[89701]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:29 np0005466013 python3.9[89853]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:29 np0005466013 python3.9[89931]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:30 np0005466013 python3.9[90083]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:30 np0005466013 python3.9[90161]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:31 np0005466013 python3.9[90313]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:32 np0005466013 python3.9[90465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:32 np0005466013 python3.9[90543]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:33 np0005466013 python3.9[90695]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:33 np0005466013 python3.9[90773]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:34 np0005466013 python3.9[90925]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:34 np0005466013 systemd[1]: Reloading.
Oct  2 07:44:34 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:34 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:35 np0005466013 python3.9[91115]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:36 np0005466013 python3.9[91193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:36 np0005466013 python3.9[91345]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:37 np0005466013 python3.9[91423]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:38 np0005466013 python3.9[91575]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:38 np0005466013 systemd[1]: Reloading.
Oct  2 07:44:38 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:38 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:38 np0005466013 systemd[1]: Starting Create netns directory...
Oct  2 07:44:38 np0005466013 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:44:38 np0005466013 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:44:38 np0005466013 systemd[1]: Finished Create netns directory.
Oct  2 07:44:39 np0005466013 python3.9[91768]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:40 np0005466013 python3.9[91920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:41 np0005466013 python3.9[92043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405479.8588486-1371-224004800948827/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:42 np0005466013 python3.9[92195]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:43 np0005466013 python3.9[92347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:44 np0005466013 python3.9[92470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405482.9521687-1446-158735894311834/.source.json _original_basename=.9z6j0i2f follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:44 np0005466013 python3.9[92622]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:47 np0005466013 python3.9[93049]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct  2 07:44:48 np0005466013 python3.9[93201]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:44:49 np0005466013 python3.9[93353]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:44:49 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:50 np0005466013 python3[93516]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:44:50 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:50 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:51 np0005466013 podman[93550]: 2025-10-02 11:44:51.08223291 +0000 UTC m=+0.045799291 container create ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:44:51 np0005466013 podman[93550]: 2025-10-02 11:44:51.058675803 +0000 UTC m=+0.022242214 image pull ae232aa720979600656d94fc26ba957f1cdf5bca825fe9b57990f60c6534611f quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:44:51 np0005466013 python3[93516]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
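The PODMAN-CONTAINER-DEBUG line above shows `edpm_container_manage` translating its kolla-style `config_data` dict into `podman create` flags. A simplified, hypothetical sketch of that mapping (environment, healthcheck, network, privilege, user, and volumes; the real module also emits conmon-pidfile, labels, and log-driver options):

```python
def podman_create_args(name: str, cfg: dict) -> list[str]:
    """Render a `podman create` argv from a kolla-style config_data dict."""
    args = ["podman", "create", "--name", name]
    for key, value in cfg.get("environment", {}).items():
        args += ["--env", f"{key}={value}"]
    if "healthcheck" in cfg:
        args += ["--healthcheck-command", cfg["healthcheck"]["test"]]
    if cfg.get("net") == "host":
        args += ["--network", "host"]
    if cfg.get("privileged"):
        args.append("--privileged=True")
    if "user" in cfg:
        args += ["--user", cfg["user"]]
    for vol in cfg.get("volumes", []):
        args += ["--volume", vol]
    args.append(cfg["image"])  # image reference comes last
    return args
```

Note that `restart: always` and `depends_on` from `config_data` do not appear as podman flags here: in this deployment, restart and ordering are handled by the generated `edpm_ovn_controller.service` systemd unit rather than by podman itself.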
Oct  2 07:44:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:52 np0005466013 python3.9[93741]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:53 np0005466013 python3.9[93895]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:53 np0005466013 python3.9[93971]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:54 np0005466013 python3.9[94122]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405493.7751577-1710-250042204585326/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:55 np0005466013 python3.9[94198]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:44:55 np0005466013 systemd[1]: Reloading.
Oct  2 07:44:55 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:55 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:55 np0005466013 python3.9[94309]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:55 np0005466013 systemd[1]: Reloading.
Oct  2 07:44:55 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:55 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:56 np0005466013 systemd[1]: Starting ovn_controller container...
Oct  2 07:44:56 np0005466013 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct  2 07:44:56 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:44:56 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d0dc0fc5e0fe0e80f0a48745bed47d078f4e61e981638cac0444e3fc9520ce3/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  2 07:44:56 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486.
Oct  2 07:44:56 np0005466013 podman[94351]: 2025-10-02 11:44:56.241435813 +0000 UTC m=+0.171224544 container init ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + sudo -E kolla_set_configs
Oct  2 07:44:56 np0005466013 podman[94351]: 2025-10-02 11:44:56.266499711 +0000 UTC m=+0.196288452 container start ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:44:56 np0005466013 edpm-start-podman-container[94351]: ovn_controller
Oct  2 07:44:56 np0005466013 systemd[1]: Created slice User Slice of UID 0.
Oct  2 07:44:56 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  2 07:44:56 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  2 07:44:56 np0005466013 systemd[1]: Starting User Manager for UID 0...
Oct  2 07:44:56 np0005466013 edpm-start-podman-container[94350]: Creating additional drop-in dependency for "ovn_controller" (ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486)
Oct  2 07:44:56 np0005466013 podman[94372]: 2025-10-02 11:44:56.33566904 +0000 UTC m=+0.058020807 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:44:56 np0005466013 systemd[1]: ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486-38ae293ca7a434d3.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:44:56 np0005466013 systemd[1]: ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486-38ae293ca7a434d3.service: Failed with result 'exit-code'.
Oct  2 07:44:56 np0005466013 systemd[1]: Reloading.
Oct  2 07:44:56 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:56 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:56 np0005466013 systemd[94396]: Queued start job for default target Main User Target.
Oct  2 07:44:56 np0005466013 systemd[94396]: Created slice User Application Slice.
Oct  2 07:44:56 np0005466013 systemd[94396]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  2 07:44:56 np0005466013 systemd[94396]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:44:56 np0005466013 systemd[94396]: Reached target Paths.
Oct  2 07:44:56 np0005466013 systemd[94396]: Reached target Timers.
Oct  2 07:44:56 np0005466013 systemd[94396]: Starting D-Bus User Message Bus Socket...
Oct  2 07:44:56 np0005466013 systemd[94396]: Starting Create User's Volatile Files and Directories...
Oct  2 07:44:56 np0005466013 systemd[94396]: Finished Create User's Volatile Files and Directories.
Oct  2 07:44:56 np0005466013 systemd[94396]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:44:56 np0005466013 systemd[94396]: Reached target Sockets.
Oct  2 07:44:56 np0005466013 systemd[94396]: Reached target Basic System.
Oct  2 07:44:56 np0005466013 systemd[94396]: Reached target Main User Target.
Oct  2 07:44:56 np0005466013 systemd[94396]: Startup finished in 142ms.
Oct  2 07:44:56 np0005466013 systemd[1]: Started User Manager for UID 0.
Oct  2 07:44:56 np0005466013 systemd[1]: Started ovn_controller container.
Oct  2 07:44:56 np0005466013 systemd[1]: Started Session c1 of User root.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: INFO:__main__:Validating config file
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: INFO:__main__:Writing out command to execute
Oct  2 07:44:56 np0005466013 systemd[1]: session-c1.scope: Deactivated successfully.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: ++ cat /run_command
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + ARGS=
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + sudo kolla_copy_cacerts
Oct  2 07:44:56 np0005466013 systemd[1]: Started Session c2 of User root.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + [[ ! -n '' ]]
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + . kolla_extend_start
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + umask 0022
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  2 07:44:56 np0005466013 systemd[1]: session-c2.scope: Deactivated successfully.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.7321] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.7326] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.7333] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.7337] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.7339] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 07:44:56 np0005466013 kernel: br-int: entered promiscuous mode
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  2 07:44:56 np0005466013 systemd-udevd[94498]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00024|main|INFO|OVS feature set changed, force recompute.
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466013 ovn_controller[94366]: 2025-10-02T11:44:56Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.8408] manager: (ovn-3ff68c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct  2 07:44:56 np0005466013 kernel: genev_sys_6081: entered promiscuous mode
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.8580] device (genev_sys_6081): carrier: link connected
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.8582] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.8861] manager: (ovn-ef6a8b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct  2 07:44:56 np0005466013 NetworkManager[51205]: <info>  [1759405496.9331] manager: (ovn-c9f3d6-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Oct  2 07:44:57 np0005466013 python3.9[94631]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:57 np0005466013 ovs-vsctl[94632]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct  2 07:44:58 np0005466013 python3.9[94784]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:58 np0005466013 ovs-vsctl[94786]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct  2 07:44:59 np0005466013 python3.9[94939]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:59 np0005466013 ovs-vsctl[94940]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct  2 07:44:59 np0005466013 systemd[1]: session-21.scope: Deactivated successfully.
Oct  2 07:44:59 np0005466013 systemd[1]: session-21.scope: Consumed 43.818s CPU time.
Oct  2 07:44:59 np0005466013 systemd-logind[784]: Session 21 logged out. Waiting for processes to exit.
Oct  2 07:44:59 np0005466013 systemd-logind[784]: Removed session 21.
Oct  2 07:45:04 np0005466013 systemd-logind[784]: New session 23 of user zuul.
Oct  2 07:45:04 np0005466013 systemd[1]: Started Session 23 of User zuul.
Oct  2 07:45:05 np0005466013 python3.9[95118]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:45:06 np0005466013 systemd[1]: Stopping User Manager for UID 0...
Oct  2 07:45:06 np0005466013 systemd[94396]: Activating special unit Exit the Session...
Oct  2 07:45:06 np0005466013 systemd[94396]: Stopped target Main User Target.
Oct  2 07:45:06 np0005466013 systemd[94396]: Stopped target Basic System.
Oct  2 07:45:06 np0005466013 systemd[94396]: Stopped target Paths.
Oct  2 07:45:06 np0005466013 systemd[94396]: Stopped target Sockets.
Oct  2 07:45:06 np0005466013 systemd[94396]: Stopped target Timers.
Oct  2 07:45:06 np0005466013 systemd[94396]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 07:45:06 np0005466013 systemd[94396]: Closed D-Bus User Message Bus Socket.
Oct  2 07:45:06 np0005466013 systemd[94396]: Stopped Create User's Volatile Files and Directories.
Oct  2 07:45:06 np0005466013 systemd[94396]: Removed slice User Application Slice.
Oct  2 07:45:06 np0005466013 systemd[94396]: Reached target Shutdown.
Oct  2 07:45:06 np0005466013 systemd[94396]: Finished Exit the Session.
Oct  2 07:45:06 np0005466013 systemd[94396]: Reached target Exit the Session.
Oct  2 07:45:06 np0005466013 systemd[1]: user@0.service: Deactivated successfully.
Oct  2 07:45:06 np0005466013 systemd[1]: Stopped User Manager for UID 0.
Oct  2 07:45:06 np0005466013 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  2 07:45:06 np0005466013 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  2 07:45:06 np0005466013 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  2 07:45:06 np0005466013 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  2 07:45:06 np0005466013 systemd[1]: Removed slice User Slice of UID 0.
Oct  2 07:45:07 np0005466013 python3.9[95277]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:07 np0005466013 python3.9[95429]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:08 np0005466013 python3.9[95581]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:09 np0005466013 python3.9[95733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:09 np0005466013 python3.9[95885]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:10 np0005466013 python3.9[96035]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:45:11 np0005466013 python3.9[96187]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  2 07:45:13 np0005466013 python3.9[96337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:13 np0005466013 python3.9[96458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405512.368764-225-259397422152964/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:14 np0005466013 python3.9[96608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:15 np0005466013 python3.9[96730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405514.065495-270-191234044370008/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:16 np0005466013 python3.9[96882]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:45:16 np0005466013 python3.9[96966]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:45:19 np0005466013 python3.9[97119]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:45:20 np0005466013 python3.9[97272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:20 np0005466013 python3.9[97393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405519.758161-382-54156584011961/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:21 np0005466013 python3.9[97543]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:21 np0005466013 python3.9[97664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405520.9173455-382-220102817594005/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:23 np0005466013 python3.9[97814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:23 np0005466013 python3.9[97935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405522.8280852-513-2560987565222/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:24 np0005466013 python3.9[98085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:25 np0005466013 python3.9[98206]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405524.0394819-513-207919561129203/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:25 np0005466013 python3.9[98356]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:45:26 np0005466013 python3.9[98510]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:26 np0005466013 ovn_controller[94366]: 2025-10-02T11:45:26Z|00025|memory|INFO|16256 kB peak resident set size after 30.0 seconds
Oct  2 07:45:26 np0005466013 ovn_controller[94366]: 2025-10-02T11:45:26Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Oct  2 07:45:26 np0005466013 podman[98535]: 2025-10-02 11:45:26.74615483 +0000 UTC m=+0.122306504 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:45:27 np0005466013 python3.9[98688]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:27 np0005466013 python3.9[98766]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:28 np0005466013 python3.9[98918]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:28 np0005466013 python3.9[98996]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:29 np0005466013 python3.9[99148]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:30 np0005466013 python3.9[99300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:30 np0005466013 python3.9[99378]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:31 np0005466013 python3.9[99530]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:31 np0005466013 python3.9[99608]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:32 np0005466013 python3.9[99760]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:45:32 np0005466013 systemd[1]: Reloading.
Oct  2 07:45:32 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:32 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:45:33 np0005466013 python3.9[99949]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:33 np0005466013 python3.9[100027]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:34 np0005466013 python3.9[100179]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:35 np0005466013 python3.9[100257]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:35 np0005466013 python3.9[100409]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:45:35 np0005466013 systemd[1]: Reloading.
Oct  2 07:45:36 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:45:36 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:36 np0005466013 systemd[1]: Starting Create netns directory...
Oct  2 07:45:36 np0005466013 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:45:36 np0005466013 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:45:36 np0005466013 systemd[1]: Finished Create netns directory.
Oct  2 07:45:38 np0005466013 python3.9[100602]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:39 np0005466013 python3.9[100754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:39 np0005466013 python3.9[100877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405538.5110843-967-173611539349660/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:40 np0005466013 python3.9[101029]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:41 np0005466013 python3.9[101181]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:41 np0005466013 python3.9[101304]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405540.9640183-1041-170234380887857/.source.json _original_basename=.0f12ezir follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:42 np0005466013 python3.9[101456]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:45 np0005466013 python3.9[101883]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct  2 07:45:46 np0005466013 python3.9[102035]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:45:47 np0005466013 python3.9[102187]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:45:48 np0005466013 python3[102363]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:45:55 np0005466013 podman[102376]: 2025-10-02 11:45:55.505558224 +0000 UTC m=+6.875436154 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:45:55 np0005466013 podman[102475]: 2025-10-02 11:45:55.627496935 +0000 UTC m=+0.019811530 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:45:55 np0005466013 podman[102475]: 2025-10-02 11:45:55.812765687 +0000 UTC m=+0.205080252 container create 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct  2 07:45:55 np0005466013 python3[102363]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:45:56 np0005466013 python3.9[102665]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:45:57 np0005466013 podman[102791]: 2025-10-02 11:45:57.278650638 +0000 UTC m=+0.109953226 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:45:57 np0005466013 python3.9[102837]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:57 np0005466013 python3.9[102922]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:45:58 np0005466013 python3.9[103073]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405557.962593-1305-218054280839648/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:59 np0005466013 python3.9[103149]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:45:59 np0005466013 systemd[1]: Reloading.
Oct  2 07:45:59 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:59 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:45:59 np0005466013 python3.9[103260]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:00 np0005466013 systemd[1]: Reloading.
Oct  2 07:46:00 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:00 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:00 np0005466013 systemd[1]: Starting ovn_metadata_agent container...
Oct  2 07:46:00 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:46:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a369279edf95a5863a8e01214ea2839504987fd9675fd96b7b5b55517f60400/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct  2 07:46:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a369279edf95a5863a8e01214ea2839504987fd9675fd96b7b5b55517f60400/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 07:46:00 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa.
Oct  2 07:46:00 np0005466013 podman[103302]: 2025-10-02 11:46:00.370407978 +0000 UTC m=+0.106425578 container init 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + sudo -E kolla_set_configs
Oct  2 07:46:00 np0005466013 podman[103302]: 2025-10-02 11:46:00.398955044 +0000 UTC m=+0.134972664 container start 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:46:00 np0005466013 edpm-start-podman-container[103302]: ovn_metadata_agent
Oct  2 07:46:00 np0005466013 edpm-start-podman-container[103301]: Creating additional drop-in dependency for "ovn_metadata_agent" (4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa)
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Validating config file
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Copying service configuration files
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Writing out command to execute
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /var/lib/neutron
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: ++ cat /run_command
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + CMD=neutron-ovn-metadata-agent
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + ARGS=
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + sudo kolla_copy_cacerts
Oct  2 07:46:00 np0005466013 systemd[1]: Reloading.
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + [[ ! -n '' ]]
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + . kolla_extend_start
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: Running command: 'neutron-ovn-metadata-agent'
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + umask 0022
Oct  2 07:46:00 np0005466013 ovn_metadata_agent[103318]: + exec neutron-ovn-metadata-agent
Oct  2 07:46:00 np0005466013 podman[103325]: 2025-10-02 11:46:00.485779818 +0000 UTC m=+0.077141329 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 07:46:00 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:00 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:00 np0005466013 systemd[1]: Started ovn_metadata_agent container.
Oct  2 07:46:01 np0005466013 systemd[1]: session-23.scope: Deactivated successfully.
Oct  2 07:46:01 np0005466013 systemd[1]: session-23.scope: Consumed 47.365s CPU time.
Oct  2 07:46:01 np0005466013 systemd-logind[784]: Session 23 logged out. Waiting for processes to exit.
Oct  2 07:46:01 np0005466013 systemd-logind[784]: Removed session 23.
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.219 103323 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.219 103323 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.219 103323 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.220 103323 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.221 103323 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.222 103323 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.223 103323 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.223 103323 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.223 103323 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.223 103323 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.223 103323 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.223 103323 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.223 103323 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.224 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.225 103323 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.226 103323 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.227 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.228 103323 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.229 103323 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.230 103323 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.231 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.232 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.233 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.234 103323 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.235 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.236 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.237 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.238 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.239 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.240 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.241 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.242 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.243 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.244 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.245 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.246 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.247 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.248 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.249 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.250 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.251 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.252 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.253 103323 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.262 103323 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.262 103323 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.263 103323 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.263 103323 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.263 103323 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.274 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 1fc220e5-4479-4f53-8f4d-9aefe7dad458 (UUID: 1fc220e5-4479-4f53-8f4d-9aefe7dad458) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.314 103323 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.315 103323 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.315 103323 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.315 103323 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.318 103323 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.323 103323 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.334 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '1fc220e5-4479-4f53-8f4d-9aefe7dad458'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], external_ids={}, name=1fc220e5-4479-4f53-8f4d-9aefe7dad458, nb_cfg_timestamp=1759405504820, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.335 103323 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7febe61f0bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.336 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.336 103323 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.336 103323 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.336 103323 INFO oslo_service.service [-] Starting 1 workers#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.340 103323 DEBUG oslo_service.service [-] Started child 103434 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.343 103434 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-372653'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.344 103323 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp9f5tjeb8/privsep.sock']#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.364 103434 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.365 103434 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.365 103434 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.372 103434 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.378 103434 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.387 103434 INFO eventlet.wsgi.server [-] (103434) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Oct  2 07:46:02 np0005466013 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.976 103323 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.976 103323 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9f5tjeb8/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.858 103439 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.861 103439 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.865 103439 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.865 103439 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103439#033[00m
Oct  2 07:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:02.979 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[43ffe3d8-9cfa-4392-a8e8-3431f6eca68e]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:46:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:03.473 103439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:46:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:03.473 103439 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:46:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:03.473 103439 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:46:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:03.995 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd2ab95-f38e-4acc-9d21-3990a9503d98]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:46:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:03.997 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, column=external_ids, values=({'neutron:ovn-metadata-id': '4be5bf17-844a-591e-87ee-8303d773261d'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.014 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.023 103323 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.023 103323 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.023 103323 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.024 103323 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.024 103323 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.024 103323 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.024 103323 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.024 103323 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.024 103323 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.024 103323 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.025 103323 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.026 103323 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.027 103323 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.028 103323 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.029 103323 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.030 103323 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.031 103323 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.032 103323 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.033 103323 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.034 103323 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.035 103323 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.036 103323 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.037 103323 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.038 103323 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.038 103323 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.038 103323 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.038 103323 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.038 103323 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.038 103323 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.039 103323 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.039 103323 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.039 103323 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.039 103323 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.039 103323 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.039 103323 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.039 103323 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.040 103323 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.041 103323 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.042 103323 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.043 103323 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.044 103323 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.045 103323 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.046 103323 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.047 103323 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.048 103323 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.049 103323 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.050 103323 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.051 103323 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.052 103323 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.053 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.054 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.055 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.056 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.057 103323 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.057 103323 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.057 103323 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.057 103323 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.057 103323 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:46:04.057 103323 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:46:06 np0005466013 systemd-logind[784]: New session 24 of user zuul.
Oct  2 07:46:06 np0005466013 systemd[1]: Started Session 24 of User zuul.
Oct  2 07:46:07 np0005466013 python3.9[103597]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:46:08 np0005466013 python3.9[103753]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:09 np0005466013 python3.9[103918]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:46:09 np0005466013 systemd[1]: Reloading.
Oct  2 07:46:09 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:09 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:10 np0005466013 python3.9[104104]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:46:10 np0005466013 network[104121]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:46:10 np0005466013 network[104122]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:46:10 np0005466013 network[104123]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:46:18 np0005466013 python3.9[104387]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:18 np0005466013 python3.9[104540]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:19 np0005466013 python3.9[104693]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:20 np0005466013 python3.9[104846]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:21 np0005466013 python3.9[104999]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:22 np0005466013 python3.9[105152]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:22 np0005466013 python3.9[105305]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:24 np0005466013 python3.9[105458]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:24 np0005466013 python3.9[105610]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:25 np0005466013 python3.9[105762]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:26 np0005466013 python3.9[105914]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:26 np0005466013 python3.9[106066]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:27 np0005466013 python3.9[106218]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:27 np0005466013 podman[106342]: 2025-10-02 11:46:27.648135924 +0000 UTC m=+0.086885457 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:46:27 np0005466013 python3.9[106387]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:29 np0005466013 python3.9[106549]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:29 np0005466013 python3.9[106701]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:30 np0005466013 python3.9[106853]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:31 np0005466013 podman[106977]: 2025-10-02 11:46:31.201589576 +0000 UTC m=+0.061927534 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 07:46:31 np0005466013 python3.9[107022]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:31 np0005466013 python3.9[107175]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:32 np0005466013 python3.9[107327]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:33 np0005466013 python3.9[107479]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:34 np0005466013 python3.9[107631]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:34 np0005466013 python3.9[107783]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:46:35 np0005466013 python3.9[107935]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:46:35 np0005466013 systemd[1]: Reloading.
Oct  2 07:46:36 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:36 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:36 np0005466013 python3.9[108121]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:37 np0005466013 python3.9[108274]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:38 np0005466013 python3.9[108427]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:38 np0005466013 python3.9[108580]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:39 np0005466013 python3.9[108733]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:40 np0005466013 python3.9[108886]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:40 np0005466013 python3.9[109039]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:43 np0005466013 python3.9[109192]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct  2 07:46:44 np0005466013 python3.9[109345]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:46:45 np0005466013 python3.9[109503]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:46:46 np0005466013 python3.9[109663]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:46:47 np0005466013 python3.9[109747]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:46:58 np0005466013 podman[109915]: 2025-10-02 11:46:58.705516929 +0000 UTC m=+0.080594029 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 07:47:01 np0005466013 podman[109960]: 2025-10-02 11:47:01.674732213 +0000 UTC m=+0.046466502 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:47:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:47:02.255 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 07:47:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:47:02.256 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 07:47:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:47:02.256 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 07:47:20 np0005466013 kernel: SELinux:  Converting 2752 SID table entries...
Oct  2 07:47:20 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:47:20 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:47:20 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:47:20 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:47:20 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:47:20 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:47:20 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:47:29 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct  2 07:47:29 np0005466013 podman[109993]: 2025-10-02 11:47:29.726808143 +0000 UTC m=+0.085992901 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:47:33 np0005466013 podman[110023]: 2025-10-02 11:47:33.194010009 +0000 UTC m=+0.576211321 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 07:47:33 np0005466013 kernel: SELinux:  Converting 2752 SID table entries...
Oct  2 07:47:33 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:47:33 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:47:33 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:47:33 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:47:33 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:47:33 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:47:33 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:48:00 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct  2 07:48:00 np0005466013 podman[119476]: 2025-10-02 11:48:00.744363499 +0000 UTC m=+0.114470369 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 07:48:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:48:02.256 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 07:48:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:48:02.256 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 07:48:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:48:02.256 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 07:48:03 np0005466013 podman[121419]: 2025-10-02 11:48:03.667845531 +0000 UTC m=+0.046606348 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 07:48:26 np0005466013 kernel: SELinux:  Converting 2753 SID table entries...
Oct  2 07:48:26 np0005466013 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:48:26 np0005466013 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:48:26 np0005466013 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:48:26 np0005466013 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:48:26 np0005466013 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:48:26 np0005466013 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:48:26 np0005466013 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:48:27 np0005466013 dbus-broker-launch[741]: Noticed file-system modification, trigger reload.
Oct  2 07:48:27 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct  2 07:48:27 np0005466013 dbus-broker-launch[741]: Noticed file-system modification, trigger reload.
Oct  2 07:48:31 np0005466013 podman[126920]: 2025-10-02 11:48:31.200898197 +0000 UTC m=+0.094924948 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 07:48:34 np0005466013 podman[127123]: 2025-10-02 11:48:34.568231442 +0000 UTC m=+0.085649838 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 07:48:35 np0005466013 systemd[1]: Stopping OpenSSH server daemon...
Oct  2 07:48:35 np0005466013 systemd[1]: sshd.service: Deactivated successfully.
Oct  2 07:48:35 np0005466013 systemd[1]: Stopped OpenSSH server daemon.
Oct  2 07:48:35 np0005466013 systemd[1]: sshd.service: Consumed 1.449s CPU time, no IO.
Oct  2 07:48:35 np0005466013 systemd[1]: Stopped target sshd-keygen.target.
Oct  2 07:48:35 np0005466013 systemd[1]: Stopping sshd-keygen.target...
Oct  2 07:48:35 np0005466013 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:48:35 np0005466013 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:48:35 np0005466013 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:48:35 np0005466013 systemd[1]: Reached target sshd-keygen.target.
Oct  2 07:48:35 np0005466013 systemd[1]: Starting OpenSSH server daemon...
Oct  2 07:48:35 np0005466013 systemd[1]: Started OpenSSH server daemon.
Oct  2 07:48:37 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:48:37 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:48:37 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:37 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:37 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:37 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:48:39 np0005466013 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:48:39 np0005466013 systemd[1]: Started PackageKit Daemon.
Oct  2 07:48:41 np0005466013 python3.9[131714]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:41 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:41 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:41 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:42 np0005466013 python3.9[132898]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:42 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:42 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:42 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:43 np0005466013 python3.9[134242]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:43 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:43 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:43 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:44 np0005466013 python3.9[135545]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:44 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:44 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:44 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:45 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:48:45 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:48:45 np0005466013 systemd[1]: man-db-cache-update.service: Consumed 10.122s CPU time.
Oct  2 07:48:45 np0005466013 systemd[1]: run-rd577fc7b45fa424b99524c8a843f38c9.service: Deactivated successfully.
Oct  2 07:48:46 np0005466013 python3.9[137069]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:46 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:47 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:47 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:47 np0005466013 python3.9[137259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:48 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:48 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:48 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:48 np0005466013 python3.9[137451]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:49 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:49 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:49 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:50 np0005466013 python3.9[137641]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:50 np0005466013 python3.9[137796]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:51 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:51 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:51 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:52 np0005466013 python3.9[137986]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:52 np0005466013 systemd[1]: Reloading.
Oct  2 07:48:52 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:52 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:52 np0005466013 systemd[1]: Listening on libvirt proxy daemon socket.
Oct  2 07:48:52 np0005466013 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct  2 07:48:53 np0005466013 python3.9[138178]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:54 np0005466013 python3.9[138333]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:55 np0005466013 python3.9[138488]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:55 np0005466013 python3.9[138643]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:56 np0005466013 python3.9[138798]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:57 np0005466013 python3.9[138953]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:58 np0005466013 python3.9[139108]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:59 np0005466013 python3.9[139263]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:59 np0005466013 python3.9[139418]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:00 np0005466013 python3.9[139573]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:01 np0005466013 podman[139728]: 2025-10-02 11:49:01.340557561 +0000 UTC m=+0.082131627 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:49:01 np0005466013 python3.9[139729]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:49:02.257 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 07:49:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:49:02.257 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 07:49:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:49:02.257 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 07:49:02 np0005466013 python3.9[139910]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:03 np0005466013 python3.9[140065]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:04 np0005466013 python3.9[140220]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:04 np0005466013 podman[140277]: 2025-10-02 11:49:04.679865605 +0000 UTC m=+0.049916694 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 07:49:05 np0005466013 python3.9[140394]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:05 np0005466013 python3.9[140546]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:06 np0005466013 python3.9[140698]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:06 np0005466013 python3.9[140850]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:07 np0005466013 python3.9[141002]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:08 np0005466013 python3.9[141154]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:09 np0005466013 python3.9[141306]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:09 np0005466013 python3.9[141431]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405748.3937633-1629-84208967580898/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:10 np0005466013 python3.9[141583]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:11 np0005466013 python3.9[141708]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405750.0323558-1629-232113585492386/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:11 np0005466013 python3.9[141860]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:12 np0005466013 python3.9[141985]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405751.3411424-1629-278065907097469/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:13 np0005466013 python3.9[142137]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:13 np0005466013 python3.9[142262]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405752.5960574-1629-152929763879450/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:14 np0005466013 python3.9[142414]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:14 np0005466013 python3.9[142539]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405753.8353832-1629-215100564531919/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:15 np0005466013 python3.9[142691]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:16 np0005466013 python3.9[142816]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405755.0586164-1629-179741835080320/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:16 np0005466013 python3.9[142968]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:17 np0005466013 python3.9[143091]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405756.3429172-1629-210341152110859/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:18 np0005466013 python3.9[143243]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:18 np0005466013 python3.9[143368]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405757.557356-1629-188577820325287/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:19 np0005466013 python3.9[143520]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct  2 07:49:20 np0005466013 python3.9[143673]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:20 np0005466013 python3.9[143825]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:21 np0005466013 python3.9[143977]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:22 np0005466013 python3.9[144129]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:22 np0005466013 python3.9[144281]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:23 np0005466013 python3.9[144433]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:23 np0005466013 python3.9[144585]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:24 np0005466013 python3.9[144737]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:25 np0005466013 python3.9[144889]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:25 np0005466013 python3.9[145041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:26 np0005466013 python3.9[145193]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:27 np0005466013 python3.9[145345]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:27 np0005466013 python3.9[145497]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:28 np0005466013 python3.9[145649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:29 np0005466013 python3.9[145801]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:29 np0005466013 python3.9[145924]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405768.5649228-2293-123274913185426/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:30 np0005466013 python3.9[146076]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:30 np0005466013 python3.9[146199]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405769.8886638-2293-248418317908246/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:31 np0005466013 python3.9[146351]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:31 np0005466013 podman[146352]: 2025-10-02 11:49:31.603939565 +0000 UTC m=+0.068802408 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:49:32 np0005466013 python3.9[146498]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405771.0194592-2293-56353545986350/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:32 np0005466013 python3.9[146650]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:33 np0005466013 python3.9[146773]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405772.2384977-2293-87339236758376/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:33 np0005466013 python3.9[146925]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:34 np0005466013 python3.9[147048]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405773.4449723-2293-276354283415387/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:34 np0005466013 podman[147174]: 2025-10-02 11:49:34.946139272 +0000 UTC m=+0.080624693 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent)
Oct  2 07:49:35 np0005466013 python3.9[147221]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:35 np0005466013 python3.9[147344]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405774.6339948-2293-24947446004573/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:36 np0005466013 python3.9[147497]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:36 np0005466013 python3.9[147620]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405775.8007479-2293-29974192279234/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:37 np0005466013 python3.9[147772]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:38 np0005466013 python3.9[147895]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405777.009966-2293-149436399835598/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:38 np0005466013 python3.9[148047]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:39 np0005466013 python3.9[148170]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405778.1467059-2293-256547605377083/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:39 np0005466013 python3.9[148323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:40 np0005466013 python3.9[148446]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405779.3004494-2293-174693768574011/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:40 np0005466013 python3.9[148598]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:41 np0005466013 python3.9[148721]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405780.4394221-2293-198196268181474/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:42 np0005466013 python3.9[148873]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:42 np0005466013 python3.9[148996]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405781.7226999-2293-71696034893061/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:43 np0005466013 python3.9[149149]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:43 np0005466013 python3.9[149272]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405782.8452654-2293-107394890865949/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:44 np0005466013 python3.9[149424]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:45 np0005466013 python3.9[149547]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405783.9497726-2293-170875514906431/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:45 np0005466013 python3.9[149697]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:49:46 np0005466013 python3.9[149852]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct  2 07:49:48 np0005466013 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct  2 07:49:48 np0005466013 python3.9[150011]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:49 np0005466013 python3.9[150163]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:50 np0005466013 python3.9[150315]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:51 np0005466013 python3.9[150467]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:51 np0005466013 python3.9[150620]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:52 np0005466013 python3.9[150772]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:53 np0005466013 python3.9[150924]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:53 np0005466013 python3.9[151076]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:54 np0005466013 python3.9[151228]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:54 np0005466013 python3.9[151380]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:55 np0005466013 python3.9[151533]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:55 np0005466013 systemd[1]: Reloading.
Oct  2 07:49:55 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:55 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:55 np0005466013 systemd[1]: Starting libvirt logging daemon socket...
Oct  2 07:49:55 np0005466013 systemd[1]: Listening on libvirt logging daemon socket.
Oct  2 07:49:55 np0005466013 systemd[1]: Starting libvirt logging daemon admin socket...
Oct  2 07:49:55 np0005466013 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct  2 07:49:55 np0005466013 systemd[1]: Starting libvirt logging daemon...
Oct  2 07:49:56 np0005466013 systemd[1]: Started libvirt logging daemon.
Oct  2 07:49:56 np0005466013 python3.9[151726]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:56 np0005466013 systemd[1]: Reloading.
Oct  2 07:49:56 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:56 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:57 np0005466013 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct  2 07:49:57 np0005466013 systemd[1]: Starting libvirt nodedev daemon socket...
Oct  2 07:49:57 np0005466013 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct  2 07:49:57 np0005466013 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct  2 07:49:57 np0005466013 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct  2 07:49:57 np0005466013 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct  2 07:49:57 np0005466013 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct  2 07:49:57 np0005466013 systemd[1]: Starting libvirt nodedev daemon...
Oct  2 07:49:57 np0005466013 systemd[1]: Started libvirt nodedev daemon.
Oct  2 07:49:57 np0005466013 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct  2 07:49:57 np0005466013 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct  2 07:49:57 np0005466013 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct  2 07:49:57 np0005466013 python3.9[151942]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:57 np0005466013 systemd[1]: Reloading.
Oct  2 07:49:57 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:57 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:58 np0005466013 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct  2 07:49:58 np0005466013 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct  2 07:49:58 np0005466013 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct  2 07:49:58 np0005466013 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct  2 07:49:58 np0005466013 systemd[1]: Starting libvirt proxy daemon...
Oct  2 07:49:58 np0005466013 systemd[1]: Started libvirt proxy daemon.
Oct  2 07:49:58 np0005466013 setroubleshoot[151763]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 43986b5d-62b7-4f47-a5b4-d27e8c0c8386
Oct  2 07:49:58 np0005466013 setroubleshoot[151763]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  2 07:49:58 np0005466013 setroubleshoot[151763]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 43986b5d-62b7-4f47-a5b4-d27e8c0c8386
Oct  2 07:49:58 np0005466013 setroubleshoot[151763]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  2 07:49:59 np0005466013 python3.9[152163]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:59 np0005466013 systemd[1]: Reloading.
Oct  2 07:49:59 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:59 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:59 np0005466013 systemd[1]: Listening on libvirt locking daemon socket.
Oct  2 07:49:59 np0005466013 systemd[1]: Starting libvirt QEMU daemon socket...
Oct  2 07:49:59 np0005466013 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  2 07:49:59 np0005466013 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct  2 07:49:59 np0005466013 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct  2 07:49:59 np0005466013 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct  2 07:49:59 np0005466013 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct  2 07:49:59 np0005466013 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct  2 07:49:59 np0005466013 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct  2 07:49:59 np0005466013 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct  2 07:49:59 np0005466013 systemd[1]: Starting libvirt QEMU daemon...
Oct  2 07:49:59 np0005466013 systemd[1]: Started libvirt QEMU daemon.
Oct  2 07:50:00 np0005466013 python3.9[152376]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:50:00 np0005466013 systemd[1]: Reloading.
Oct  2 07:50:00 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:00 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:00 np0005466013 systemd[1]: Starting libvirt secret daemon socket...
Oct  2 07:50:00 np0005466013 systemd[1]: Listening on libvirt secret daemon socket.
Oct  2 07:50:00 np0005466013 systemd[1]: Starting libvirt secret daemon admin socket...
Oct  2 07:50:00 np0005466013 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct  2 07:50:00 np0005466013 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct  2 07:50:00 np0005466013 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct  2 07:50:00 np0005466013 systemd[1]: Starting libvirt secret daemon...
Oct  2 07:50:00 np0005466013 systemd[1]: Started libvirt secret daemon.
Oct  2 07:50:01 np0005466013 python3.9[152586]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:02 np0005466013 podman[152710]: 2025-10-02 11:50:02.205133354 +0000 UTC m=+0.115667214 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:50:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:50:02.258 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:50:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:50:02.258 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:50:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:50:02.258 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:50:02 np0005466013 python3.9[152750]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:50:03 np0005466013 python3.9[152916]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:03 np0005466013 python3.9[153040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405802.9260428-3328-197958588385944/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:04 np0005466013 python3.9[153192]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:05 np0005466013 podman[153317]: 2025-10-02 11:50:05.370926915 +0000 UTC m=+0.062262501 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  2 07:50:05 np0005466013 python3.9[153360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:06 np0005466013 python3.9[153440]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:06 np0005466013 python3.9[153592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:07 np0005466013 python3.9[153670]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wk2mjuek recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:07 np0005466013 python3.9[153822]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:08 np0005466013 python3.9[153900]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:08 np0005466013 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct  2 07:50:08 np0005466013 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct  2 07:50:09 np0005466013 python3.9[154053]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:10 np0005466013 python3[154206]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:50:10 np0005466013 python3.9[154358]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:11 np0005466013 python3.9[154436]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:11 np0005466013 python3.9[154588]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:12 np0005466013 python3.9[154666]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:13 np0005466013 python3.9[154818]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:13 np0005466013 python3.9[154896]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:14 np0005466013 python3.9[155048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:14 np0005466013 python3.9[155126]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:15 np0005466013 python3.9[155278]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:16 np0005466013 python3.9[155403]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405815.197449-3703-275158415592144/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:17 np0005466013 python3.9[155555]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:17 np0005466013 python3.9[155707]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:18 np0005466013 python3.9[155862]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:19 np0005466013 python3.9[156014]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:20 np0005466013 python3.9[156167]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:20 np0005466013 python3.9[156321]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:21 np0005466013 python3.9[156476]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:22 np0005466013 python3.9[156628]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:22 np0005466013 python3.9[156751]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405821.8638048-3918-190127353095194/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:23 np0005466013 python3.9[156903]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:24 np0005466013 python3.9[157026]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405823.1336148-3964-139335894257982/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:24 np0005466013 python3.9[157178]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:25 np0005466013 python3.9[157301]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405824.394745-4009-180671770372473/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:26 np0005466013 python3.9[157453]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:26 np0005466013 systemd[1]: Reloading.
Oct  2 07:50:26 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:26 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:26 np0005466013 systemd[1]: Reached target edpm_libvirt.target.
Oct  2 07:50:28 np0005466013 python3.9[157646]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  2 07:50:28 np0005466013 systemd[1]: Reloading.
Oct  2 07:50:28 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:28 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:29 np0005466013 systemd[1]: Reloading.
Oct  2 07:50:29 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:29 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:29 np0005466013 systemd[1]: session-24.scope: Deactivated successfully.
Oct  2 07:50:29 np0005466013 systemd[1]: session-24.scope: Consumed 3min 21.302s CPU time.
Oct  2 07:50:29 np0005466013 systemd-logind[784]: Session 24 logged out. Waiting for processes to exit.
Oct  2 07:50:29 np0005466013 systemd-logind[784]: Removed session 24.
Oct  2 07:50:32 np0005466013 podman[157743]: 2025-10-02 11:50:32.720817539 +0000 UTC m=+0.095616605 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:50:35 np0005466013 systemd-logind[784]: New session 25 of user zuul.
Oct  2 07:50:35 np0005466013 systemd[1]: Started Session 25 of User zuul.
Oct  2 07:50:35 np0005466013 podman[157773]: 2025-10-02 11:50:35.54160713 +0000 UTC m=+0.050940886 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  2 07:50:36 np0005466013 python3.9[157944]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:50:37 np0005466013 python3.9[158100]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:38 np0005466013 python3.9[158252]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:39 np0005466013 python3.9[158404]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:39 np0005466013 python3.9[158556]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:50:40 np0005466013 python3.9[158708]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:41 np0005466013 python3.9[158860]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:42 np0005466013 python3.9[159014]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:42 np0005466013 systemd[1]: Reloading.
Oct  2 07:50:42 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:42 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:43 np0005466013 python3.9[159203]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:50:43 np0005466013 network[159220]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:50:43 np0005466013 network[159221]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:50:43 np0005466013 network[159222]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:50:48 np0005466013 python3.9[159495]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:48 np0005466013 systemd[1]: Reloading.
Oct  2 07:50:49 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:49 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:49 np0005466013 python3.9[159683]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:50 np0005466013 python3.9[159835]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:50:51 np0005466013 podman[159871]: 2025-10-02 11:50:51.108935983 +0000 UTC m=+0.043216623 container create 92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:50:51 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1340] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/21)
Oct  2 07:50:51 np0005466013 kernel: podman0: port 1(veth0) entered blocking state
Oct  2 07:50:51 np0005466013 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:50:51 np0005466013 kernel: veth0: entered allmulticast mode
Oct  2 07:50:51 np0005466013 kernel: veth0: entered promiscuous mode
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1545] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Oct  2 07:50:51 np0005466013 kernel: podman0: port 1(veth0) entered blocking state
Oct  2 07:50:51 np0005466013 kernel: podman0: port 1(veth0) entered forwarding state
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1564] device (veth0): carrier: link connected
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1566] device (podman0): carrier: link connected
Oct  2 07:50:51 np0005466013 systemd-udevd[159892]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:50:51 np0005466013 systemd-udevd[159896]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1770] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1775] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1780] device (podman0): Activation: starting connection 'podman0' (50c9630e-00d0-4508-a688-47174329dccf)
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1781] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1783] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1784] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.1786] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466013 podman[159871]: 2025-10-02 11:50:51.088782663 +0000 UTC m=+0.023063323 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:50:51 np0005466013 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:50:51 np0005466013 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.2057] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.2059] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.2067] device (podman0): Activation: successful, device activated.
Oct  2 07:50:51 np0005466013 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct  2 07:50:51 np0005466013 systemd[1]: Started libpod-conmon-92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f.scope.
Oct  2 07:50:51 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:50:51 np0005466013 podman[159871]: 2025-10-02 11:50:51.469679418 +0000 UTC m=+0.403960078 container init 92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:50:51 np0005466013 podman[159871]: 2025-10-02 11:50:51.478219595 +0000 UTC m=+0.412500235 container start 92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 07:50:51 np0005466013 iscsid_config[160030]: iqn.1994-05.com.redhat:cc8cb94a9c7#015
Oct  2 07:50:51 np0005466013 systemd[1]: libpod-92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f.scope: Deactivated successfully.
Oct  2 07:50:51 np0005466013 podman[159871]: 2025-10-02 11:50:51.484360748 +0000 UTC m=+0.418641418 container attach 92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 07:50:51 np0005466013 podman[159871]: 2025-10-02 11:50:51.485614887 +0000 UTC m=+0.419895537 container died 92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 07:50:51 np0005466013 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:50:51 np0005466013 kernel: veth0 (unregistering): left allmulticast mode
Oct  2 07:50:51 np0005466013 kernel: veth0 (unregistering): left promiscuous mode
Oct  2 07:50:51 np0005466013 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:50:51 np0005466013 NetworkManager[51205]: <info>  [1759405851.5352] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 07:50:51 np0005466013 systemd[1]: run-netns-netns\x2d970424ab\x2dd296\x2de201\x2db8c8\x2d5e088dc94c9d.mount: Deactivated successfully.
Oct  2 07:50:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f-userdata-shm.mount: Deactivated successfully.
Oct  2 07:50:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay-67e6a03112d3d8652464c5a8646d7283def752a8a7b5866fb80c8b1e3a8b0ff6-merged.mount: Deactivated successfully.
Oct  2 07:50:51 np0005466013 podman[159871]: 2025-10-02 11:50:51.897150781 +0000 UTC m=+0.831431421 container remove 92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 07:50:51 np0005466013 python3.9[159835]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Oct  2 07:50:51 np0005466013 systemd[1]: libpod-conmon-92f7cfc8da3936ad74dfd55c1dfcdeb816f8ef5d4900bfde59acefbcb914475f.scope: Deactivated successfully.
Oct  2 07:50:51 np0005466013 python3.9[159835]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
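The `#012` sequences in the error above (and the `#015` trailing the iscsi-iname output a few lines earlier) are rsyslog's octal escapes for embedded control characters: newline and carriage return, respectively. A minimal sketch for restoring the original multi-line text when reading such logs (the helper name is ours):

```python
import re

def unescape_syslog(msg: str) -> str:
    """Decode rsyslog's '#DDD' octal escapes for control characters.

    rsyslog replaces embedded control characters with '#' followed by
    three octal digits, e.g. '#012' for newline (0o12 = 10) and '#015'
    for carriage return (0o15 = 13). Note this will also decode a
    literal '#012' that happened to be in the original message.
    """
    return re.sub(r"#([0-7]{3})", lambda m: chr(int(m.group(1), 8)), msg)

# The IQN printed by the iscsid_config container carries a trailing CR
# because the container ran with a TTY:
print(repr(unescape_syslog("iqn.1994-05.com.redhat:cc8cb94a9c7#015")))
```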
Oct  2 07:50:52 np0005466013 python3.9[160272]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:53 np0005466013 python3.9[160395]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405852.2368512-324-34309963896942/.source.iscsi _original_basename=.yxjp2deq follow=False checksum=25375af88adeb6f165088bb7162ee92ac112ab55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
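The copy task above writes the IQN generated by the one-shot `iscsid_config` container run into `/etc/iscsi/initiatorname.iscsi` (the file content itself is redacted as NOT_LOGGING_PARAMETER). Assuming the standard open-iscsi file format and the IQN printed at 07:50:51, the resulting file would presumably look like:

```
InitiatorName=iqn.1994-05.com.redhat:cc8cb94a9c7
```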
Oct  2 07:50:54 np0005466013 python3.9[160547]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:54 np0005466013 python3.9[160697]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:55 np0005466013 python3.9[160851]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
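The lineinfile task above ensures exactly one `node.session.auth.chap_algs` entry in `/etc/iscsi/iscsid.conf`, inserting it after the commented default (`insertafter=^#node.session.auth.chap.algs`) if no existing line matches the regexp. Per the `line=` parameter, the resulting entry is:

```
node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5
```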
Oct  2 07:50:56 np0005466013 python3.9[161003]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:57 np0005466013 python3.9[161155]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:57 np0005466013 python3.9[161233]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:58 np0005466013 python3.9[161385]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:58 np0005466013 python3.9[161463]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:59 np0005466013 python3.9[161615]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:00 np0005466013 python3.9[161767]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:00 np0005466013 python3.9[161845]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:01 np0005466013 python3.9[161997]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:01 np0005466013 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:51:01 np0005466013 python3.9[162075]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:51:02.259 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:51:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:51:02.260 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:51:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:51:02.261 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:51:02 np0005466013 python3.9[162227]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:02 np0005466013 systemd[1]: Reloading.
Oct  2 07:51:02 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:02 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:02 np0005466013 podman[162229]: 2025-10-02 11:51:02.943616357 +0000 UTC m=+0.145632830 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 07:51:03 np0005466013 python3.9[162443]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:04 np0005466013 python3.9[162521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:04 np0005466013 python3.9[162673]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:05 np0005466013 python3.9[162751]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:05 np0005466013 podman[162849]: 2025-10-02 11:51:05.677559261 +0000 UTC m=+0.055409226 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  2 07:51:06 np0005466013 python3.9[162922]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:06 np0005466013 systemd[1]: Reloading.
Oct  2 07:51:06 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:06 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:06 np0005466013 systemd[1]: Starting Create netns directory...
Oct  2 07:51:06 np0005466013 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:51:06 np0005466013 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:51:06 np0005466013 systemd[1]: Finished Create netns directory.
Oct  2 07:51:07 np0005466013 python3.9[163115]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:08 np0005466013 python3.9[163267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:08 np0005466013 python3.9[163390]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405867.7854693-787-173083753348409/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:09 np0005466013 python3.9[163542]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:10 np0005466013 python3.9[163694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:11 np0005466013 python3.9[163817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405870.3637578-862-13402516892364/.source.json _original_basename=.uqc9xog5 follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:12 np0005466013 python3.9[163969]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:14 np0005466013 python3.9[164396]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct  2 07:51:15 np0005466013 python3.9[164548]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:51:16 np0005466013 python3.9[164700]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:51:18 np0005466013 python3[164878]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:51:18 np0005466013 podman[164916]: 2025-10-02 11:51:18.219903555 +0000 UTC m=+0.019274060 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:51:18 np0005466013 podman[164916]: 2025-10-02 11:51:18.401654365 +0000 UTC m=+0.201024850 container create f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 07:51:18 np0005466013 python3[164878]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:51:19 np0005466013 python3.9[165105]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:20 np0005466013 python3.9[165259]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:20 np0005466013 python3.9[165335]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:21 np0005466013 python3.9[165486]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405880.7627366-1126-31765832164127/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:22 np0005466013 python3.9[165562]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:51:22 np0005466013 systemd[1]: Reloading.
Oct  2 07:51:22 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:22 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:22 np0005466013 python3.9[165673]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:22 np0005466013 systemd[1]: Reloading.
Oct  2 07:51:22 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:22 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:23 np0005466013 systemd[1]: Starting iscsid container...
Oct  2 07:51:23 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:51:23 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5209eacee739208547c1d177d547186bdbf8ade1b94b1915520fa189cb81871d/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:23 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5209eacee739208547c1d177d547186bdbf8ade1b94b1915520fa189cb81871d/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:23 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5209eacee739208547c1d177d547186bdbf8ade1b94b1915520fa189cb81871d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:23 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb.
Oct  2 07:51:23 np0005466013 podman[165713]: 2025-10-02 11:51:23.281691277 +0000 UTC m=+0.107521518 container init f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:51:23 np0005466013 iscsid[165728]: + sudo -E kolla_set_configs
Oct  2 07:51:23 np0005466013 podman[165713]: 2025-10-02 11:51:23.304115044 +0000 UTC m=+0.129945255 container start f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:51:23 np0005466013 podman[165713]: iscsid
Oct  2 07:51:23 np0005466013 systemd[1]: Started iscsid container.
Oct  2 07:51:23 np0005466013 systemd[1]: Created slice User Slice of UID 0.
Oct  2 07:51:23 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  2 07:51:23 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  2 07:51:23 np0005466013 systemd[1]: Starting User Manager for UID 0...
Oct  2 07:51:23 np0005466013 podman[165735]: 2025-10-02 11:51:23.375637174 +0000 UTC m=+0.061376910 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:51:23 np0005466013 systemd[1]: f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb-6d327863188491da.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:51:23 np0005466013 systemd[1]: f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb-6d327863188491da.service: Failed with result 'exit-code'.
Oct  2 07:51:23 np0005466013 systemd[165752]: Queued start job for default target Main User Target.
Oct  2 07:51:23 np0005466013 systemd[165752]: Created slice User Application Slice.
Oct  2 07:51:23 np0005466013 systemd[165752]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  2 07:51:23 np0005466013 systemd[165752]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:51:23 np0005466013 systemd[165752]: Reached target Paths.
Oct  2 07:51:23 np0005466013 systemd[165752]: Reached target Timers.
Oct  2 07:51:23 np0005466013 systemd[165752]: Starting D-Bus User Message Bus Socket...
Oct  2 07:51:23 np0005466013 systemd[165752]: Starting Create User's Volatile Files and Directories...
Oct  2 07:51:23 np0005466013 systemd[165752]: Finished Create User's Volatile Files and Directories.
Oct  2 07:51:23 np0005466013 systemd[165752]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:51:23 np0005466013 systemd[165752]: Reached target Sockets.
Oct  2 07:51:23 np0005466013 systemd[165752]: Reached target Basic System.
Oct  2 07:51:23 np0005466013 systemd[165752]: Reached target Main User Target.
Oct  2 07:51:23 np0005466013 systemd[165752]: Startup finished in 116ms.
Oct  2 07:51:23 np0005466013 systemd[1]: Started User Manager for UID 0.
Oct  2 07:51:23 np0005466013 systemd[1]: Started Session c3 of User root.
Oct  2 07:51:23 np0005466013 iscsid[165728]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:51:23 np0005466013 iscsid[165728]: INFO:__main__:Validating config file
Oct  2 07:51:23 np0005466013 iscsid[165728]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:51:23 np0005466013 iscsid[165728]: INFO:__main__:Writing out command to execute
Oct  2 07:51:23 np0005466013 systemd[1]: session-c3.scope: Deactivated successfully.
Oct  2 07:51:23 np0005466013 iscsid[165728]: ++ cat /run_command
Oct  2 07:51:23 np0005466013 iscsid[165728]: + CMD='/usr/sbin/iscsid -f'
Oct  2 07:51:23 np0005466013 iscsid[165728]: + ARGS=
Oct  2 07:51:23 np0005466013 iscsid[165728]: + sudo kolla_copy_cacerts
Oct  2 07:51:23 np0005466013 systemd[1]: Started Session c4 of User root.
Oct  2 07:51:23 np0005466013 systemd[1]: session-c4.scope: Deactivated successfully.
Oct  2 07:51:23 np0005466013 iscsid[165728]: + [[ ! -n '' ]]
Oct  2 07:51:23 np0005466013 iscsid[165728]: + . kolla_extend_start
Oct  2 07:51:23 np0005466013 iscsid[165728]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct  2 07:51:23 np0005466013 iscsid[165728]: Running command: '/usr/sbin/iscsid -f'
Oct  2 07:51:23 np0005466013 iscsid[165728]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct  2 07:51:23 np0005466013 iscsid[165728]: + umask 0022
Oct  2 07:51:23 np0005466013 iscsid[165728]: + exec /usr/sbin/iscsid -f
Oct  2 07:51:23 np0005466013 kernel: Loading iSCSI transport class v2.0-870.
Oct  2 07:51:24 np0005466013 python3.9[165934]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:25 np0005466013 python3.9[166086]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:26 np0005466013 python3.9[166238]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:51:26 np0005466013 network[166255]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:51:26 np0005466013 network[166256]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:51:26 np0005466013 network[166257]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:51:30 np0005466013 python3.9[166531]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:51:31 np0005466013 python3.9[166683]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct  2 07:51:32 np0005466013 python3.9[166839]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:33 np0005466013 python3.9[166962]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405892.273064-1348-124119826663133/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:33 np0005466013 systemd[1]: Stopping User Manager for UID 0...
Oct  2 07:51:33 np0005466013 systemd[165752]: Activating special unit Exit the Session...
Oct  2 07:51:33 np0005466013 systemd[165752]: Stopped target Main User Target.
Oct  2 07:51:33 np0005466013 systemd[165752]: Stopped target Basic System.
Oct  2 07:51:33 np0005466013 systemd[165752]: Stopped target Paths.
Oct  2 07:51:33 np0005466013 systemd[165752]: Stopped target Sockets.
Oct  2 07:51:33 np0005466013 systemd[165752]: Stopped target Timers.
Oct  2 07:51:33 np0005466013 systemd[165752]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 07:51:33 np0005466013 systemd[165752]: Closed D-Bus User Message Bus Socket.
Oct  2 07:51:33 np0005466013 systemd[165752]: Stopped Create User's Volatile Files and Directories.
Oct  2 07:51:33 np0005466013 systemd[165752]: Removed slice User Application Slice.
Oct  2 07:51:33 np0005466013 systemd[165752]: Reached target Shutdown.
Oct  2 07:51:33 np0005466013 systemd[165752]: Finished Exit the Session.
Oct  2 07:51:33 np0005466013 systemd[165752]: Reached target Exit the Session.
Oct  2 07:51:33 np0005466013 systemd[1]: user@0.service: Deactivated successfully.
Oct  2 07:51:33 np0005466013 systemd[1]: Stopped User Manager for UID 0.
Oct  2 07:51:33 np0005466013 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  2 07:51:33 np0005466013 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  2 07:51:33 np0005466013 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  2 07:51:33 np0005466013 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  2 07:51:33 np0005466013 systemd[1]: Removed slice User Slice of UID 0.
Oct  2 07:51:33 np0005466013 podman[166987]: 2025-10-02 11:51:33.718761473 +0000 UTC m=+0.092063479 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:51:34 np0005466013 python3.9[167142]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:35 np0005466013 python3.9[167294]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:51:35 np0005466013 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  2 07:51:35 np0005466013 systemd[1]: Stopped Load Kernel Modules.
Oct  2 07:51:35 np0005466013 systemd[1]: Stopping Load Kernel Modules...
Oct  2 07:51:35 np0005466013 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:51:35 np0005466013 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:51:35 np0005466013 podman[167422]: 2025-10-02 11:51:35.813374373 +0000 UTC m=+0.058021343 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:51:36 np0005466013 python3.9[167470]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:36 np0005466013 python3.9[167623]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:37 np0005466013 python3.9[167775]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:38 np0005466013 python3.9[167927]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:39 np0005466013 python3.9[168050]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405897.9487174-1522-261110299398722/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:39 np0005466013 python3.9[168202]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:51:40 np0005466013 python3.9[168355]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:41 np0005466013 python3.9[168507]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:42 np0005466013 python3.9[168659]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:43 np0005466013 python3.9[168811]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:43 np0005466013 python3.9[168963]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:44 np0005466013 python3.9[169115]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:44 np0005466013 python3.9[169267]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:45 np0005466013 python3.9[169419]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:46 np0005466013 python3.9[169573]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:47 np0005466013 python3.9[169725]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:48 np0005466013 python3.9[169877]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:48 np0005466013 python3.9[169955]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:49 np0005466013 python3.9[170107]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:49 np0005466013 python3.9[170185]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:50 np0005466013 python3.9[170337]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:51 np0005466013 python3.9[170489]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:51 np0005466013 python3.9[170567]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:52 np0005466013 python3.9[170719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:52 np0005466013 python3.9[170797]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:53 np0005466013 podman[170921]: 2025-10-02 11:51:53.578721425 +0000 UTC m=+0.051342373 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 07:51:53 np0005466013 python3.9[170969]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:53 np0005466013 systemd[1]: Reloading.
Oct  2 07:51:53 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:53 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:55 np0005466013 python3.9[171159]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:55 np0005466013 python3.9[171237]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:56 np0005466013 python3.9[171389]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:56 np0005466013 python3.9[171467]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:57 np0005466013 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct  2 07:51:57 np0005466013 python3.9[171620]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:57 np0005466013 systemd[1]: Reloading.
Oct  2 07:51:57 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:57 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:57 np0005466013 systemd[1]: Starting Create netns directory...
Oct  2 07:51:57 np0005466013 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:51:57 np0005466013 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:51:57 np0005466013 systemd[1]: Finished Create netns directory.
Oct  2 07:51:58 np0005466013 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  2 07:51:58 np0005466013 python3.9[171814]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:59 np0005466013 python3.9[171966]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:59 np0005466013 systemd[1]: virtqemud.service: Deactivated successfully.
Oct  2 07:51:59 np0005466013 python3.9[172090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405919.0154333-2143-227842485179112/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:00 np0005466013 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct  2 07:52:00 np0005466013 python3.9[172242]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:01 np0005466013 python3.9[172395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:02 np0005466013 python3.9[172518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405921.2043324-2217-70166203094768/.source.json _original_basename=.obec44mc follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:52:02.260 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:52:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:52:02.261 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:52:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:52:02.261 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:52:02 np0005466013 python3.9[172670]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:04 np0005466013 podman[172917]: 2025-10-02 11:52:04.197721128 +0000 UTC m=+0.082033253 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 07:52:05 np0005466013 python3.9[173123]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct  2 07:52:05 np0005466013 podman[173247]: 2025-10-02 11:52:05.922463575 +0000 UTC m=+0.058870939 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 07:52:06 np0005466013 python3.9[173294]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:52:06 np0005466013 python3.9[173447]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:52:08 np0005466013 python3[173624]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:52:08 np0005466013 podman[173657]: 2025-10-02 11:52:08.662998419 +0000 UTC m=+0.042917757 container create 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:52:08 np0005466013 podman[173657]: 2025-10-02 11:52:08.639679162 +0000 UTC m=+0.019598510 image pull d8d739f82a6fecf9df690e49539b589e74665b54e36448657b874630717d5bd1 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:52:08 np0005466013 python3[173624]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:52:10 np0005466013 python3.9[173847]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:11 np0005466013 python3.9[174001]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:11 np0005466013 python3.9[174077]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:12 np0005466013 python3.9[174228]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405932.039775-2481-63427820708853/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:13 np0005466013 python3.9[174304]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:52:13 np0005466013 systemd[1]: Reloading.
Oct  2 07:52:13 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:13 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:14 np0005466013 python3.9[174414]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:14 np0005466013 systemd[1]: Reloading.
Oct  2 07:52:14 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:14 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:14 np0005466013 systemd[1]: Starting multipathd container...
Oct  2 07:52:14 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:52:14 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/337a8bf87fe7c4d599a8a5c7bea05f0845bf4af1e091cd1d93f3052f394f62e1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:14 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/337a8bf87fe7c4d599a8a5c7bea05f0845bf4af1e091cd1d93f3052f394f62e1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:14 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.
Oct  2 07:52:14 np0005466013 podman[174454]: 2025-10-02 11:52:14.626235413 +0000 UTC m=+0.150456245 container init 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:52:14 np0005466013 multipathd[174470]: + sudo -E kolla_set_configs
Oct  2 07:52:14 np0005466013 podman[174454]: 2025-10-02 11:52:14.65058423 +0000 UTC m=+0.174805042 container start 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd)
Oct  2 07:52:14 np0005466013 podman[174454]: multipathd
Oct  2 07:52:14 np0005466013 systemd[1]: Started multipathd container.
Oct  2 07:52:14 np0005466013 multipathd[174470]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:52:14 np0005466013 multipathd[174470]: INFO:__main__:Validating config file
Oct  2 07:52:14 np0005466013 multipathd[174470]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:52:14 np0005466013 multipathd[174470]: INFO:__main__:Writing out command to execute
Oct  2 07:52:14 np0005466013 multipathd[174470]: ++ cat /run_command
Oct  2 07:52:14 np0005466013 multipathd[174470]: + CMD='/usr/sbin/multipathd -d'
Oct  2 07:52:14 np0005466013 multipathd[174470]: + ARGS=
Oct  2 07:52:14 np0005466013 multipathd[174470]: + sudo kolla_copy_cacerts
Oct  2 07:52:14 np0005466013 multipathd[174470]: + [[ ! -n '' ]]
Oct  2 07:52:14 np0005466013 multipathd[174470]: + . kolla_extend_start
Oct  2 07:52:14 np0005466013 multipathd[174470]: Running command: '/usr/sbin/multipathd -d'
Oct  2 07:52:14 np0005466013 multipathd[174470]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  2 07:52:14 np0005466013 multipathd[174470]: + umask 0022
Oct  2 07:52:14 np0005466013 multipathd[174470]: + exec /usr/sbin/multipathd -d
Oct  2 07:52:14 np0005466013 multipathd[174470]: 3926.417167 | --------start up--------
Oct  2 07:52:14 np0005466013 multipathd[174470]: 3926.417181 | read /etc/multipath.conf
Oct  2 07:52:14 np0005466013 podman[174477]: 2025-10-02 11:52:14.743009402 +0000 UTC m=+0.079690600 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:52:14 np0005466013 multipathd[174470]: 3926.422639 | path checkers start up
Oct  2 07:52:14 np0005466013 systemd[1]: 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6-6a84737c9fdf37c6.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:52:14 np0005466013 systemd[1]: 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6-6a84737c9fdf37c6.service: Failed with result 'exit-code'.
Oct  2 07:52:15 np0005466013 python3.9[174659]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:16 np0005466013 python3.9[174813]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:17 np0005466013 python3.9[174978]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:52:17 np0005466013 systemd[1]: Stopping multipathd container...
Oct  2 07:52:17 np0005466013 multipathd[174470]: 3929.022704 | exit (signal)
Oct  2 07:52:17 np0005466013 multipathd[174470]: 3929.022762 | --------shut down-------
Oct  2 07:52:17 np0005466013 systemd[1]: libpod-5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.scope: Deactivated successfully.
Oct  2 07:52:17 np0005466013 podman[174982]: 2025-10-02 11:52:17.376741137 +0000 UTC m=+0.065864789 container died 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 07:52:17 np0005466013 systemd[1]: 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6-6a84737c9fdf37c6.timer: Deactivated successfully.
Oct  2 07:52:17 np0005466013 systemd[1]: Stopped /usr/bin/podman healthcheck run 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.
Oct  2 07:52:17 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6-userdata-shm.mount: Deactivated successfully.
Oct  2 07:52:17 np0005466013 systemd[1]: var-lib-containers-storage-overlay-337a8bf87fe7c4d599a8a5c7bea05f0845bf4af1e091cd1d93f3052f394f62e1-merged.mount: Deactivated successfully.
Oct  2 07:52:17 np0005466013 podman[174982]: 2025-10-02 11:52:17.465210094 +0000 UTC m=+0.154333746 container cleanup 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Oct  2 07:52:17 np0005466013 podman[174982]: multipathd
Oct  2 07:52:17 np0005466013 podman[175012]: multipathd
Oct  2 07:52:17 np0005466013 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct  2 07:52:17 np0005466013 systemd[1]: Stopped multipathd container.
Oct  2 07:52:17 np0005466013 systemd[1]: Starting multipathd container...
Oct  2 07:52:17 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:52:17 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/337a8bf87fe7c4d599a8a5c7bea05f0845bf4af1e091cd1d93f3052f394f62e1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:17 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/337a8bf87fe7c4d599a8a5c7bea05f0845bf4af1e091cd1d93f3052f394f62e1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:17 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.
Oct  2 07:52:17 np0005466013 podman[175025]: 2025-10-02 11:52:17.695359856 +0000 UTC m=+0.142656580 container init 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:52:17 np0005466013 multipathd[175041]: + sudo -E kolla_set_configs
Oct  2 07:52:17 np0005466013 podman[175025]: 2025-10-02 11:52:17.726519335 +0000 UTC m=+0.173816359 container start 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:52:17 np0005466013 podman[175025]: multipathd
Oct  2 07:52:17 np0005466013 systemd[1]: Started multipathd container.
Oct  2 07:52:17 np0005466013 multipathd[175041]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:52:17 np0005466013 multipathd[175041]: INFO:__main__:Validating config file
Oct  2 07:52:17 np0005466013 multipathd[175041]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:52:17 np0005466013 multipathd[175041]: INFO:__main__:Writing out command to execute
Oct  2 07:52:17 np0005466013 multipathd[175041]: ++ cat /run_command
Oct  2 07:52:17 np0005466013 multipathd[175041]: + CMD='/usr/sbin/multipathd -d'
Oct  2 07:52:17 np0005466013 multipathd[175041]: + ARGS=
Oct  2 07:52:17 np0005466013 multipathd[175041]: + sudo kolla_copy_cacerts
Oct  2 07:52:17 np0005466013 multipathd[175041]: Running command: '/usr/sbin/multipathd -d'
Oct  2 07:52:17 np0005466013 multipathd[175041]: + [[ ! -n '' ]]
Oct  2 07:52:17 np0005466013 multipathd[175041]: + . kolla_extend_start
Oct  2 07:52:17 np0005466013 multipathd[175041]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  2 07:52:17 np0005466013 multipathd[175041]: + umask 0022
Oct  2 07:52:17 np0005466013 multipathd[175041]: + exec /usr/sbin/multipathd -d
Oct  2 07:52:17 np0005466013 podman[175048]: 2025-10-02 11:52:17.803802252 +0000 UTC m=+0.058273072 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 07:52:17 np0005466013 systemd[1]: 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6-13559aa655364f85.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:52:17 np0005466013 systemd[1]: 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6-13559aa655364f85.service: Failed with result 'exit-code'.
Oct  2 07:52:17 np0005466013 multipathd[175041]: 3929.489676 | --------start up--------
Oct  2 07:52:17 np0005466013 multipathd[175041]: 3929.489694 | read /etc/multipath.conf
Oct  2 07:52:17 np0005466013 multipathd[175041]: 3929.494725 | path checkers start up
Oct  2 07:52:18 np0005466013 python3.9[175230]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:19 np0005466013 python3.9[175382]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:52:20 np0005466013 python3.9[175534]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct  2 07:52:20 np0005466013 kernel: Key type psk registered
Oct  2 07:52:21 np0005466013 python3.9[175698]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:21 np0005466013 python3.9[175821]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405940.844994-2722-221094496442150/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:22 np0005466013 python3.9[175973]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:23 np0005466013 podman[176126]: 2025-10-02 11:52:23.67168892 +0000 UTC m=+0.050093674 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 07:52:23 np0005466013 python3.9[176125]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:52:23 np0005466013 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  2 07:52:23 np0005466013 systemd[1]: Stopped Load Kernel Modules.
Oct  2 07:52:23 np0005466013 systemd[1]: Stopping Load Kernel Modules...
Oct  2 07:52:23 np0005466013 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:52:23 np0005466013 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:52:24 np0005466013 python3.9[176301]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:52:25 np0005466013 python3.9[176385]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:52:32 np0005466013 systemd[1]: Reloading.
Oct  2 07:52:32 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:32 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:32 np0005466013 systemd[1]: Reloading.
Oct  2 07:52:32 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:32 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:33 np0005466013 systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  2 07:52:33 np0005466013 systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  2 07:52:33 np0005466013 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:52:33 np0005466013 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:52:33 np0005466013 systemd[1]: Reloading.
Oct  2 07:52:33 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:33 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:33 np0005466013 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:52:34 np0005466013 podman[177625]: 2025-10-02 11:52:34.71678301 +0000 UTC m=+0.092087281 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 07:52:34 np0005466013 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:52:34 np0005466013 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:52:34 np0005466013 systemd[1]: man-db-cache-update.service: Consumed 1.499s CPU time.
Oct  2 07:52:34 np0005466013 systemd[1]: run-rf099c1baa4474a21b56f85b01d25b6fb.service: Deactivated successfully.
Oct  2 07:52:35 np0005466013 python3.9[177862]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:35 np0005466013 python3.9[178012]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:52:36 np0005466013 podman[178116]: 2025-10-02 11:52:36.67778655 +0000 UTC m=+0.054291387 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 07:52:36 np0005466013 python3.9[178187]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:38 np0005466013 python3.9[178339]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:52:38 np0005466013 systemd[1]: Reloading.
Oct  2 07:52:38 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:38 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:39 np0005466013 python3.9[178523]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:52:39 np0005466013 network[178540]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:52:39 np0005466013 network[178541]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:52:39 np0005466013 network[178542]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:52:44 np0005466013 python3.9[178819]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:45 np0005466013 python3.9[178972]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:45 np0005466013 python3.9[179125]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:46 np0005466013 python3.9[179278]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:47 np0005466013 python3.9[179431]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:48 np0005466013 python3.9[179584]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:48 np0005466013 podman[179586]: 2025-10-02 11:52:48.124601914 +0000 UTC m=+0.058648424 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:52:48 np0005466013 python3.9[179757]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:49 np0005466013 python3.9[179910]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:51 np0005466013 python3.9[180063]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:52 np0005466013 python3.9[180215]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:52 np0005466013 python3.9[180367]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:53 np0005466013 python3.9[180519]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:54 np0005466013 podman[180643]: 2025-10-02 11:52:54.182658735 +0000 UTC m=+0.080930563 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:52:54 np0005466013 python3.9[180691]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:54 np0005466013 python3.9[180843]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:55 np0005466013 python3.9[180995]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:56 np0005466013 python3.9[181147]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:57 np0005466013 python3.9[181299]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:57 np0005466013 python3.9[181451]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:58 np0005466013 python3.9[181603]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:59 np0005466013 python3.9[181755]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:59 np0005466013 python3.9[181907]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:00 np0005466013 python3.9[182059]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:00 np0005466013 python3.9[182211]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:01 np0005466013 python3.9[182363]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:53:02.262 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:53:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:53:02.262 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:53:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:53:02.262 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:53:02 np0005466013 python3.9[182515]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:03 np0005466013 python3.9[182667]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:53:04 np0005466013 python3.9[182819]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:53:04 np0005466013 systemd[1]: Reloading.
Oct  2 07:53:04 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:04 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:04 np0005466013 podman[182821]: 2025-10-02 11:53:04.979084873 +0000 UTC m=+0.081668767 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:53:06 np0005466013 python3.9[183032]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:06 np0005466013 python3.9[183185]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:07 np0005466013 podman[183310]: 2025-10-02 11:53:07.06562619 +0000 UTC m=+0.054257956 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 07:53:07 np0005466013 python3.9[183352]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:08 np0005466013 python3.9[183512]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:08 np0005466013 python3.9[183665]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:09 np0005466013 python3.9[183818]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:09 np0005466013 python3.9[183971]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:10 np0005466013 python3.9[184124]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:12 np0005466013 python3.9[184277]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:13 np0005466013 python3.9[184429]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:13 np0005466013 python3.9[184581]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:14 np0005466013 python3.9[184733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:15 np0005466013 python3.9[184885]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:15 np0005466013 python3.9[185037]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:16 np0005466013 python3.9[185189]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:17 np0005466013 python3.9[185341]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:17 np0005466013 python3.9[185493]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:18 np0005466013 podman[185617]: 2025-10-02 11:53:18.385949444 +0000 UTC m=+0.059063729 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 07:53:18 np0005466013 python3.9[185663]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:19 np0005466013 python3.9[185815]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:19 np0005466013 python3.9[185967]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:24 np0005466013 podman[186044]: 2025-10-02 11:53:24.665953236 +0000 UTC m=+0.043351278 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 07:53:25 np0005466013 python3.9[186140]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct  2 07:53:25 np0005466013 python3.9[186293]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:53:27 np0005466013 python3.9[186451]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:53:28 np0005466013 systemd-logind[784]: New session 27 of user zuul.
Oct  2 07:53:28 np0005466013 systemd[1]: Started Session 27 of User zuul.
Oct  2 07:53:28 np0005466013 systemd[1]: session-27.scope: Deactivated successfully.
Oct  2 07:53:28 np0005466013 systemd-logind[784]: Session 27 logged out. Waiting for processes to exit.
Oct  2 07:53:28 np0005466013 systemd-logind[784]: Removed session 27.
Oct  2 07:53:29 np0005466013 python3.9[186637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:29 np0005466013 python3.9[186758]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406008.7019207-4339-281268988486978/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:30 np0005466013 python3.9[186908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:30 np0005466013 python3.9[186984]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:31 np0005466013 python3.9[187134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:32 np0005466013 python3.9[187255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406010.98071-4339-199818069711393/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:32 np0005466013 python3.9[187405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:33 np0005466013 python3.9[187526]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406012.3146048-4339-248561361865351/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:33 np0005466013 python3.9[187676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:34 np0005466013 python3.9[187797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406013.443505-4339-30984442598945/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:35 np0005466013 podman[187921]: 2025-10-02 11:53:35.3002517 +0000 UTC m=+0.117132006 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct  2 07:53:35 np0005466013 python3.9[187968]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:36 np0005466013 python3.9[188127]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:37 np0005466013 python3.9[188279]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:37 np0005466013 podman[188403]: 2025-10-02 11:53:37.661786998 +0000 UTC m=+0.051585748 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 07:53:37 np0005466013 python3.9[188448]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:38 np0005466013 python3.9[188573]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759406017.3633451-4620-126041901237211/.source _original_basename=.j9kqdpsy follow=False checksum=80fc04559bf518736c73f0670fbb398e11961636 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct  2 07:53:39 np0005466013 python3.9[188725]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:40 np0005466013 python3.9[188877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:40 np0005466013 python3.9[188998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406019.5929344-4696-11875825223775/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:41 np0005466013 python3.9[189148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:42 np0005466013 python3.9[189269]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406021.0498767-4741-277556212123273/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:43 np0005466013 python3.9[189421]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct  2 07:53:44 np0005466013 python3.9[189573]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:53:45 np0005466013 python3[189725]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:53:45 np0005466013 podman[189761]: 2025-10-02 11:53:45.208330454 +0000 UTC m=+0.060567031 container create b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, container_name=nova_compute_init, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.schema-version=1.0)
Oct  2 07:53:45 np0005466013 podman[189761]: 2025-10-02 11:53:45.169547601 +0000 UTC m=+0.021784198 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:53:45 np0005466013 python3[189725]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct  2 07:53:46 np0005466013 python3.9[189951]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:47 np0005466013 python3.9[190105]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct  2 07:53:47 np0005466013 python3.9[190257]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:53:48 np0005466013 podman[190357]: 2025-10-02 11:53:48.684806115 +0000 UTC m=+0.057290708 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Oct  2 07:53:48 np0005466013 python3[190429]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:53:49 np0005466013 podman[190465]: 2025-10-02 11:53:49.157491087 +0000 UTC m=+0.042143790 container create c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 07:53:49 np0005466013 podman[190465]: 2025-10-02 11:53:49.135031578 +0000 UTC m=+0.019684291 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:53:49 np0005466013 python3[190429]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Oct  2 07:53:50 np0005466013 python3.9[190655]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:51 np0005466013 python3.9[190809]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:51 np0005466013 python3.9[190960]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406031.1319263-5016-148013285733861/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:52 np0005466013 python3.9[191036]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:53:52 np0005466013 systemd[1]: Reloading.
Oct  2 07:53:52 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:52 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:53 np0005466013 python3.9[191149]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:53:53 np0005466013 systemd[1]: Reloading.
Oct  2 07:53:53 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:53 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:53 np0005466013 systemd[1]: Starting nova_compute container...
Oct  2 07:53:53 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:53:53 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466013 podman[191189]: 2025-10-02 11:53:53.758359919 +0000 UTC m=+0.086620214 container init c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 07:53:53 np0005466013 podman[191189]: 2025-10-02 11:53:53.766575628 +0000 UTC m=+0.094835903 container start c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:53:53 np0005466013 podman[191189]: nova_compute
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + sudo -E kolla_set_configs
Oct  2 07:53:53 np0005466013 systemd[1]: Started nova_compute container.
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Validating config file
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying service configuration files
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Deleting /etc/ceph
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Creating directory /etc/ceph
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /etc/ceph
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Writing out command to execute
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:53 np0005466013 nova_compute[191205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:53 np0005466013 nova_compute[191205]: ++ cat /run_command
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + CMD=nova-compute
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + ARGS=
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + sudo kolla_copy_cacerts
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + [[ ! -n '' ]]
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + . kolla_extend_start
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + echo 'Running command: '\''nova-compute'\'''
Oct  2 07:53:53 np0005466013 nova_compute[191205]: Running command: 'nova-compute'
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + umask 0022
Oct  2 07:53:53 np0005466013 nova_compute[191205]: + exec nova-compute
Oct  2 07:53:54 np0005466013 podman[191341]: 2025-10-02 11:53:54.891743044 +0000 UTC m=+0.051552568 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:53:55 np0005466013 python3.9[191384]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:55 np0005466013 nova_compute[191205]: 2025-10-02 11:53:55.915 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:55 np0005466013 nova_compute[191205]: 2025-10-02 11:53:55.915 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:55 np0005466013 nova_compute[191205]: 2025-10-02 11:53:55.916 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:55 np0005466013 nova_compute[191205]: 2025-10-02 11:53:55.916 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  2 07:53:56 np0005466013 python3.9[191541]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:56 np0005466013 nova_compute[191205]: 2025-10-02 11:53:56.061 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:53:56 np0005466013 nova_compute[191205]: 2025-10-02 11:53:56.074 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:53:57 np0005466013 python3.9[191693]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:57 np0005466013 nova_compute[191205]: 2025-10-02 11:53:57.900 2 INFO nova.virt.driver [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.004 2 INFO nova.compute.provider_config [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.253 2 DEBUG oslo_concurrency.lockutils [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.253 2 DEBUG oslo_concurrency.lockutils [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.253 2 DEBUG oslo_concurrency.lockutils [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.254 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.254 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.254 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.254 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.254 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.254 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.255 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.255 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.255 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.255 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.255 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.255 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.256 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.256 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.256 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.256 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.256 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.256 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.256 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.257 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.257 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.257 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.257 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.257 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.257 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.258 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.258 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.258 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.258 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.258 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.258 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.259 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.259 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.259 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.259 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.259 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.259 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.259 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.260 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.260 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.260 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.260 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.260 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.260 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.261 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.261 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.261 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.261 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.261 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.261 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.261 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.262 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.262 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.262 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.262 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.262 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.262 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.262 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.263 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.263 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.263 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.263 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.263 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.263 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.264 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.264 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.264 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.264 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.264 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.264 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.264 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.265 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.265 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.265 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.265 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.265 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.265 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.265 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.266 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.266 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.266 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.266 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.266 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.266 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.267 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.267 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.267 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.267 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.267 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.267 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.267 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.268 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.268 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.268 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.268 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.268 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.268 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.268 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.269 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.269 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.269 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.269 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.269 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.269 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.269 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.270 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.271 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.271 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.271 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.271 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.271 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.271 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.271 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.272 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.272 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.272 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.272 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.272 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.272 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.272 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.273 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.273 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.273 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.273 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.273 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.273 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.273 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.331 2 WARNING oslo_config.cfg [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  2 07:53:58 np0005466013 nova_compute[191205]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  2 07:53:58 np0005466013 nova_compute[191205]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  2 07:53:58 np0005466013 nova_compute[191205]: and ``live_migration_inbound_addr`` respectively.
Oct  2 07:53:58 np0005466013 nova_compute[191205]: ).  Its value may be silently ignored in the future.#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.394 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.394 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.394 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.394 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.394 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.394 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.394 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.396 2 DEBUG oslo_service.service [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.397 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.449 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.450 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.451 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.451 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  2 07:53:58 np0005466013 systemd[1]: Starting libvirt QEMU daemon...
Oct  2 07:53:58 np0005466013 systemd[1]: Started libvirt QEMU daemon.
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.527 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fea3d773f70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.530 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fea3d773f70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.530 2 INFO nova.virt.libvirt.driver [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  2 07:53:58 np0005466013 python3.9[191845]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.581 2 WARNING nova.virt.libvirt.driver [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct  2 07:53:58 np0005466013 nova_compute[191205]: 2025-10-02 11:53:58.582 2 DEBUG nova.virt.libvirt.volume.mount [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  2 07:53:58 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.312 2 INFO nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Libvirt host capabilities <capabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <host>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <uuid>f7714643-27c8-4af0-a2a6-be96dd6da6b1</uuid>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <arch>x86_64</arch>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <microcode version='16777317'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <signature family='23' model='49' stepping='0'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='x2apic'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='tsc-deadline'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='osxsave'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='hypervisor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='tsc_adjust'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='spec-ctrl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='stibp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='arch-capabilities'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='cmp_legacy'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='topoext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='virt-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='lbrv'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='tsc-scale'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='vmcb-clean'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='pause-filter'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='pfthreshold'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='rdctl-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='mds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <pages unit='KiB' size='4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <pages unit='KiB' size='2048'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <pages unit='KiB' size='1048576'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <power_management>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <suspend_mem/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <suspend_disk/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <suspend_hybrid/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </power_management>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <iommu support='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <migration_features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <live/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <uri_transports>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <uri_transport>tcp</uri_transport>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <uri_transport>rdma</uri_transport>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </uri_transports>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </migration_features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <topology>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <cells num='1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <cell id='0'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          <memory unit='KiB'>7864108</memory>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          <pages unit='KiB' size='4'>1966027</pages>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          <pages unit='KiB' size='2048'>0</pages>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          <distances>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <sibling id='0' value='10'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          </distances>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          <cpus num='8'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:          </cpus>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        </cell>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </cells>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </topology>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <cache>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </cache>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <secmodel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model>selinux</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <doi>0</doi>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </secmodel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <secmodel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model>dac</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <doi>0</doi>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </secmodel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </host>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <guest>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <os_type>hvm</os_type>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <arch name='i686'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <wordsize>32</wordsize>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <domain type='qemu'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <domain type='kvm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </arch>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <pae/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <nonpae/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <apic default='on' toggle='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <cpuselection/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <deviceboot/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <externalSnapshot/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </guest>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <guest>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <os_type>hvm</os_type>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <arch name='x86_64'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <wordsize>64</wordsize>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <domain type='qemu'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <domain type='kvm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </arch>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <apic default='on' toggle='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <cpuselection/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <deviceboot/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <externalSnapshot/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </guest>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 
Oct  2 07:53:59 np0005466013 nova_compute[191205]: </capabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.320 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.339 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  2 07:53:59 np0005466013 nova_compute[191205]: <domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <arch>i686</arch>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <vcpu max='240'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <os supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='firmware'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>rom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pflash</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>yes</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='secure'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </loader>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </os>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>file</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>memfd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </memoryBacking>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>disk</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>floppy</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>lun</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ide</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>fdc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>sata</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </disk>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vnc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>dbus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </graphics>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <video supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vga</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>none</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>bochs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </video>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='mode'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>requisite</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>optional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pci</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hostdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>random</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </rng>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>path</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>handle</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </filesystem>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emulator</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>external</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>2.0</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </tpm>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </redirdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pty</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>unix</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </channel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>qemu</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </crypto>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>passt</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </interface>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>isa</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </panic>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='features'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vapic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>runtime</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>synic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>stimer</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reset</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ipi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>avic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hyperv>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: </domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.346 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  2 07:53:59 np0005466013 nova_compute[191205]: <domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <arch>i686</arch>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <vcpu max='4096'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <os supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='firmware'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>rom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pflash</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>yes</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='secure'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </loader>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </os>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>file</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>memfd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </memoryBacking>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>disk</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>floppy</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>lun</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>fdc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>sata</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </disk>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vnc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>dbus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </graphics>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <video supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vga</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>none</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>bochs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </video>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='mode'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>requisite</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>optional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pci</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hostdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>random</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </rng>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>path</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>handle</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </filesystem>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emulator</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>external</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>2.0</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </tpm>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </redirdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pty</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>unix</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </channel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>qemu</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </crypto>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>passt</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </interface>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>isa</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </panic>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='features'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vapic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>runtime</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>synic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>stimer</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reset</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ipi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>avic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hyperv>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: </domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.372 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.375 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  2 07:53:59 np0005466013 nova_compute[191205]: <domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <arch>x86_64</arch>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <vcpu max='240'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <os supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='firmware'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>rom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pflash</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>yes</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='secure'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </loader>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </os>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>file</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>memfd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </memoryBacking>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>disk</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>floppy</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>lun</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ide</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>fdc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>sata</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </disk>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vnc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>dbus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </graphics>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <video supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vga</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>none</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>bochs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </video>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='mode'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>requisite</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>optional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pci</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hostdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>random</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </rng>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>path</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>handle</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </filesystem>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emulator</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>external</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>2.0</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </tpm>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </redirdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pty</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>unix</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </channel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>qemu</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </crypto>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>passt</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </interface>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>isa</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </panic>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='features'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vapic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>runtime</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>synic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>stimer</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reset</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ipi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>avic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hyperv>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: </domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.438 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  2 07:53:59 np0005466013 nova_compute[191205]: <domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <arch>x86_64</arch>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <vcpu max='4096'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <os supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='firmware'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>efi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>rom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pflash</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>yes</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='secure'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>yes</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>no</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </loader>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </os>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>on</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>off</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </blockers>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </mode>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>file</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <value>memfd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </memoryBacking>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>disk</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>floppy</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>lun</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>fdc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>sata</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </disk>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vnc</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>dbus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </graphics>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <video supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vga</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>none</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>bochs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </video>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='mode'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>requisite</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>optional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pci</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>scsi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hostdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>random</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>egd</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </rng>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>path</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>handle</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </filesystem>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emulator</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>external</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>2.0</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </tpm>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='bus'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>usb</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </redirdev>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>pty</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>unix</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </channel>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='type'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>qemu</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>builtin</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </crypto>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>default</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>passt</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </interface>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='model'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>isa</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </panic>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </devices>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      <enum name='features'>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vapic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>runtime</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>synic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>stimer</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reset</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>ipi</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>avic</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:      </enum>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    </hyperv>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  </features>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: </domainCapabilities>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.495 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.495 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.496 2 DEBUG nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.496 2 INFO nova.virt.libvirt.host [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Secure Boot support detected#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.498 2 INFO nova.virt.libvirt.driver [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.499 2 INFO nova.virt.libvirt.driver [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.511 2 DEBUG nova.virt.libvirt.driver [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] cpu compare xml: <cpu match="exact">
Oct  2 07:53:59 np0005466013 nova_compute[191205]:  <model>Nehalem</model>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: </cpu>
Oct  2 07:53:59 np0005466013 nova_compute[191205]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.514 2 DEBUG nova.virt.libvirt.driver [None req-977f39b8-3f99-4ba5-a206-73bed3e02014 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  2 07:53:59 np0005466013 python3.9[192084]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:53:59 np0005466013 systemd[1]: Stopping nova_compute container...
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.963 2 DEBUG oslo_concurrency.lockutils [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.963 2 DEBUG oslo_concurrency.lockutils [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:53:59 np0005466013 nova_compute[191205]: 2025-10-02 11:53:59.964 2 DEBUG oslo_concurrency.lockutils [None req-b9fee032-caf8-416a-971e-50d5949120f2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:54:00 np0005466013 virtqemud[191867]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct  2 07:54:00 np0005466013 systemd[1]: libpod-c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8.scope: Deactivated successfully.
Oct  2 07:54:00 np0005466013 systemd[1]: libpod-c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8.scope: Consumed 3.169s CPU time.
Oct  2 07:54:00 np0005466013 virtqemud[191867]: hostname: compute-2
Oct  2 07:54:00 np0005466013 virtqemud[191867]: End of file while reading data: Input/output error
Oct  2 07:54:00 np0005466013 podman[192088]: 2025-10-02 11:54:00.426359241 +0000 UTC m=+0.550308352 container died c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 07:54:00 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8-userdata-shm.mount: Deactivated successfully.
Oct  2 07:54:00 np0005466013 systemd[1]: var-lib-containers-storage-overlay-f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b-merged.mount: Deactivated successfully.
Oct  2 07:54:00 np0005466013 podman[192088]: 2025-10-02 11:54:00.49355373 +0000 UTC m=+0.617502841 container cleanup c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Oct  2 07:54:00 np0005466013 podman[192088]: nova_compute
Oct  2 07:54:00 np0005466013 podman[192117]: nova_compute
Oct  2 07:54:00 np0005466013 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct  2 07:54:00 np0005466013 systemd[1]: Stopped nova_compute container.
Oct  2 07:54:00 np0005466013 systemd[1]: Starting nova_compute container...
Oct  2 07:54:00 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:54:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ce797768def4e72fbe8f8c271402e40528c5cd037562ceb4acf4e8604e3e8b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466013 podman[192129]: 2025-10-02 11:54:00.656434288 +0000 UTC m=+0.084089634 container init c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 07:54:00 np0005466013 podman[192129]: 2025-10-02 11:54:00.661823918 +0000 UTC m=+0.089479264 container start c26424251010dbb8755e82469b23befb89d0f426f5be450664d59ee7671ad0d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:54:00 np0005466013 podman[192129]: nova_compute
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + sudo -E kolla_set_configs
Oct  2 07:54:00 np0005466013 systemd[1]: Started nova_compute container.
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Validating config file
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying service configuration files
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /etc/ceph
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Creating directory /etc/ceph
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /etc/ceph
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Writing out command to execute
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466013 nova_compute[192144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466013 nova_compute[192144]: ++ cat /run_command
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + CMD=nova-compute
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + ARGS=
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + sudo kolla_copy_cacerts
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + [[ ! -n '' ]]
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + . kolla_extend_start
Oct  2 07:54:00 np0005466013 nova_compute[192144]: Running command: 'nova-compute'
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + echo 'Running command: '\''nova-compute'\'''
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + umask 0022
Oct  2 07:54:00 np0005466013 nova_compute[192144]: + exec nova-compute
Oct  2 07:54:01 np0005466013 python3.9[192307]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:54:01 np0005466013 systemd[1]: Started libpod-conmon-b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9.scope.
Oct  2 07:54:01 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:54:01 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/361d8f6f2a62a375c9da7d66abfac395c9be0977edbda7477a9f89ed6b094e3d/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:01 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/361d8f6f2a62a375c9da7d66abfac395c9be0977edbda7477a9f89ed6b094e3d/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:01 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/361d8f6f2a62a375c9da7d66abfac395c9be0977edbda7477a9f89ed6b094e3d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:01 np0005466013 podman[192332]: 2025-10-02 11:54:01.77612026 +0000 UTC m=+0.100775700 container init b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 07:54:01 np0005466013 podman[192332]: 2025-10-02 11:54:01.784278498 +0000 UTC m=+0.108933918 container start b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  2 07:54:01 np0005466013 python3.9[192307]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Applying nova statedir ownership
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct  2 07:54:01 np0005466013 nova_compute_init[192354]: INFO:nova_statedir:Nova statedir ownership complete
Oct  2 07:54:01 np0005466013 systemd[1]: libpod-b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9.scope: Deactivated successfully.
Oct  2 07:54:01 np0005466013 podman[192355]: 2025-10-02 11:54:01.870998403 +0000 UTC m=+0.028072017 container died b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:54:01 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9-userdata-shm.mount: Deactivated successfully.
Oct  2 07:54:01 np0005466013 systemd[1]: var-lib-containers-storage-overlay-361d8f6f2a62a375c9da7d66abfac395c9be0977edbda7477a9f89ed6b094e3d-merged.mount: Deactivated successfully.
Oct  2 07:54:01 np0005466013 podman[192361]: 2025-10-02 11:54:01.921816706 +0000 UTC m=+0.066340393 container cleanup b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:54:01 np0005466013 systemd[1]: libpod-conmon-b3ae9345ae2fb5b9aa9259f4bcb157e1448ee87af6cf81f51eb8d3d28e140ee9.scope: Deactivated successfully.
Oct  2 07:54:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:54:02.263 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:54:02.264 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:54:02.264 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:02 np0005466013 nova_compute[192144]: 2025-10-02 11:54:02.821 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:54:02 np0005466013 nova_compute[192144]: 2025-10-02 11:54:02.821 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:54:02 np0005466013 nova_compute[192144]: 2025-10-02 11:54:02.821 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:54:02 np0005466013 nova_compute[192144]: 2025-10-02 11:54:02.822 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  2 07:54:02 np0005466013 systemd[1]: session-25.scope: Deactivated successfully.
Oct  2 07:54:02 np0005466013 systemd[1]: session-25.scope: Consumed 2min 9.068s CPU time.
Oct  2 07:54:02 np0005466013 systemd-logind[784]: Session 25 logged out. Waiting for processes to exit.
Oct  2 07:54:02 np0005466013 systemd-logind[784]: Removed session 25.
Oct  2 07:54:02 np0005466013 nova_compute[192144]: 2025-10-02 11:54:02.961 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:54:02 np0005466013 nova_compute[192144]: 2025-10-02 11:54:02.983 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:54:03 np0005466013 nova_compute[192144]: 2025-10-02 11:54:03.841 2 INFO nova.virt.driver [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  2 07:54:03 np0005466013 nova_compute[192144]: 2025-10-02 11:54:03.940 2 INFO nova.compute.provider_config [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.320 2 DEBUG oslo_concurrency.lockutils [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.320 2 DEBUG oslo_concurrency.lockutils [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.320 2 DEBUG oslo_concurrency.lockutils [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.399 2 WARNING oslo_config.cfg [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  2 07:54:05 np0005466013 nova_compute[192144]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  2 07:54:05 np0005466013 nova_compute[192144]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  2 07:54:05 np0005466013 nova_compute[192144]: and ``live_migration_inbound_addr`` respectively.
Oct  2 07:54:05 np0005466013 nova_compute[192144]: ).  Its value may be silently ignored in the future.#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.399 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.399 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.399 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.399 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-65419a52-cbc7-460e-8418-03bfe86a359a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.465 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.491 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.491 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.492 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.492 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.519 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9a4e8a6c70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.521 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9a4e8a6c70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.523 2 INFO nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.530 2 INFO nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Libvirt host capabilities <capabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <host>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <uuid>f7714643-27c8-4af0-a2a6-be96dd6da6b1</uuid>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <arch>x86_64</arch>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <microcode version='16777317'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <signature family='23' model='49' stepping='0'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='x2apic'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='tsc-deadline'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='osxsave'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='hypervisor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='tsc_adjust'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='spec-ctrl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='stibp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='arch-capabilities'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='cmp_legacy'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='topoext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='virt-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='lbrv'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='tsc-scale'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='vmcb-clean'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='pause-filter'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='pfthreshold'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='rdctl-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='mds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <pages unit='KiB' size='4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <pages unit='KiB' size='2048'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <pages unit='KiB' size='1048576'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <power_management>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <suspend_mem/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <suspend_disk/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <suspend_hybrid/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </power_management>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <iommu support='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <migration_features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <live/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <uri_transports>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <uri_transport>tcp</uri_transport>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <uri_transport>rdma</uri_transport>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </uri_transports>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </migration_features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <topology>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <cells num='1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <cell id='0'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          <memory unit='KiB'>7864108</memory>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          <pages unit='KiB' size='4'>1966027</pages>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          <pages unit='KiB' size='2048'>0</pages>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          <distances>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <sibling id='0' value='10'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          </distances>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          <cpus num='8'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:          </cpus>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        </cell>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </cells>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </topology>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <cache>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </cache>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <secmodel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model>selinux</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <doi>0</doi>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </secmodel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <secmodel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model>dac</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <doi>0</doi>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </secmodel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </host>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <guest>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <os_type>hvm</os_type>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <arch name='i686'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <wordsize>32</wordsize>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <domain type='qemu'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <domain type='kvm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </arch>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <pae/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <nonpae/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <acpi default='on' toggle='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <apic default='on' toggle='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <cpuselection/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <deviceboot/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <externalSnapshot/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </guest>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <guest>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <os_type>hvm</os_type>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <arch name='x86_64'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <wordsize>64</wordsize>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <domain type='qemu'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <domain type='kvm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </arch>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <acpi default='on' toggle='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <apic default='on' toggle='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <cpuselection/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <deviceboot/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <externalSnapshot/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </guest>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 
Oct  2 07:54:05 np0005466013 nova_compute[192144]: </capabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.538 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.542 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  2 07:54:05 np0005466013 nova_compute[192144]: <domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <arch>i686</arch>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <vcpu max='4096'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <os supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='firmware'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>rom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pflash</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>yes</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='secure'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </loader>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </os>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>file</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>memfd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </memoryBacking>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>disk</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>floppy</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>lun</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>fdc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>sata</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </disk>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vnc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>dbus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <video supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vga</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>none</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>bochs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </video>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='mode'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>requisite</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>optional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pci</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hostdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>random</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </rng>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>path</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>handle</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </filesystem>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emulator</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>external</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>2.0</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </tpm>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </redirdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pty</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>unix</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </channel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>qemu</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </crypto>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>passt</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </interface>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>isa</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </panic>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='features'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vapic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>runtime</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>synic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>stimer</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reset</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ipi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>avic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hyperv>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: </domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.549 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  2 07:54:05 np0005466013 nova_compute[192144]: <domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <arch>i686</arch>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <vcpu max='240'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <os supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='firmware'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>rom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pflash</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>yes</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='secure'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </loader>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </os>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 podman[192444]: 2025-10-02 11:54:05.625732512 +0000 UTC m=+0.098162028 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>file</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>memfd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </memoryBacking>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>disk</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>floppy</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>lun</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ide</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>fdc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>sata</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </disk>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vnc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>dbus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <video supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vga</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>none</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>bochs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </video>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='mode'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>requisite</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>optional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pci</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hostdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>random</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </rng>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>path</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>handle</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </filesystem>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emulator</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>external</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>2.0</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </tpm>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </redirdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pty</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>unix</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </channel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>qemu</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </crypto>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>passt</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </interface>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>isa</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </panic>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='features'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vapic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>runtime</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>synic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>stimer</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reset</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ipi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>avic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hyperv>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: </domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.576 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.578 2 WARNING nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.579 2 DEBUG nova.virt.libvirt.volume.mount [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.582 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  2 07:54:05 np0005466013 nova_compute[192144]: <domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <arch>x86_64</arch>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <vcpu max='4096'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <os supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='firmware'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>efi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>rom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pflash</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>yes</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='secure'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>yes</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </loader>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </os>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>file</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>memfd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </memoryBacking>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>disk</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>floppy</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>lun</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>fdc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>sata</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </disk>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vnc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>dbus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <video supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vga</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>none</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>bochs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </video>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='mode'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>requisite</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>optional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pci</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hostdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>random</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </rng>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>path</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>handle</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </filesystem>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emulator</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>external</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>2.0</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </tpm>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </redirdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pty</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>unix</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </channel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>qemu</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </crypto>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>passt</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </interface>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>isa</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </panic>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='features'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vapic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>runtime</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>synic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>stimer</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reset</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ipi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>avic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hyperv>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: </domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.658 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  2 07:54:05 np0005466013 nova_compute[192144]: <domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <arch>x86_64</arch>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <vcpu max='240'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <os supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='firmware'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>rom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pflash</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>yes</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='secure'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>no</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </loader>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </os>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>on</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>off</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </blockers>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </mode>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>file</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <value>memfd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </memoryBacking>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>disk</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>floppy</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>lun</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ide</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>fdc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>sata</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </disk>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vnc</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>dbus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <video supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vga</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>none</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>bochs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </video>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='mode'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>requisite</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>optional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pci</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>scsi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hostdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>random</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>egd</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </rng>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>path</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>handle</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </filesystem>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emulator</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>external</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>2.0</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </tpm>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='bus'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>usb</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </redirdev>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>pty</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>unix</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </channel>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='type'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>qemu</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>builtin</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </crypto>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>default</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>passt</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </interface>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='model'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>isa</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </panic>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </devices>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      <enum name='features'>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vapic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>runtime</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>synic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>stimer</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reset</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>ipi</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>avic</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:      </enum>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    </hyperv>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  </features>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: </domainCapabilities>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.727 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.728 2 INFO nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Secure Boot support detected#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.729 2 INFO nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.729 2 INFO nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.738 2 DEBUG nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] cpu compare xml: <cpu match="exact">
Oct  2 07:54:05 np0005466013 nova_compute[192144]:  <model>Nehalem</model>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: </cpu>
Oct  2 07:54:05 np0005466013 nova_compute[192144]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.740 2 DEBUG nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.785 2 INFO nova.virt.node [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Determined node identity 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from /var/lib/nova/compute_id#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.815 2 WARNING nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Compute nodes ['8a5c5335-95d5-48d7-aa6f-2fc6c798dc80'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.896 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.968 2 WARNING nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.968 2 DEBUG oslo_concurrency.lockutils [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.969 2 DEBUG oslo_concurrency.lockutils [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.969 2 DEBUG oslo_concurrency.lockutils [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:05 np0005466013 nova_compute[192144]: 2025-10-02 11:54:05.969 2 DEBUG nova.compute.resource_tracker [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:54:06 np0005466013 systemd[1]: Starting libvirt nodedev daemon...
Oct  2 07:54:06 np0005466013 systemd[1]: Started libvirt nodedev daemon.
Oct  2 07:54:06 np0005466013 nova_compute[192144]: 2025-10-02 11:54:06.234 2 WARNING nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:54:06 np0005466013 nova_compute[192144]: 2025-10-02 11:54:06.236 2 DEBUG nova.compute.resource_tracker [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6135MB free_disk=73.66623306274414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:54:06 np0005466013 nova_compute[192144]: 2025-10-02 11:54:06.236 2 DEBUG oslo_concurrency.lockutils [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:06 np0005466013 nova_compute[192144]: 2025-10-02 11:54:06.236 2 DEBUG oslo_concurrency.lockutils [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:08 np0005466013 podman[192492]: 2025-10-02 11:54:08.668523922 +0000 UTC m=+0.048013556 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:54:08 np0005466013 nova_compute[192144]: 2025-10-02 11:54:08.677 2 WARNING nova.compute.resource_tracker [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] No compute node record for compute-2.ctlplane.example.com:8a5c5335-95d5-48d7-aa6f-2fc6c798dc80: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 could not be found.#033[00m
Oct  2 07:54:08 np0005466013 nova_compute[192144]: 2025-10-02 11:54:08.991 2 INFO nova.compute.resource_tracker [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.232 2 DEBUG nova.compute.resource_tracker [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.233 2 DEBUG nova.compute.resource_tracker [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.862 2 INFO nova.scheduler.client.report [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [req-5f2c89c1-4cba-4833-aa6a-fefe0cc0ce5c] Created resource provider record via placement API for resource provider with UUID 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 and name compute-2.ctlplane.example.com.#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.957 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  2 07:54:09 np0005466013 nova_compute[192144]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.958 2 INFO nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.958 2 DEBUG nova.compute.provider_tree [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.958 2 DEBUG nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 07:54:09 np0005466013 nova_compute[192144]: 2025-10-02 11:54:09.960 2 DEBUG nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Libvirt baseline CPU <cpu>
Oct  2 07:54:09 np0005466013 nova_compute[192144]:  <arch>x86_64</arch>
Oct  2 07:54:09 np0005466013 nova_compute[192144]:  <model>Nehalem</model>
Oct  2 07:54:09 np0005466013 nova_compute[192144]:  <vendor>AMD</vendor>
Oct  2 07:54:09 np0005466013 nova_compute[192144]:  <topology sockets="8" cores="1" threads="1"/>
Oct  2 07:54:09 np0005466013 nova_compute[192144]: </cpu>
Oct  2 07:54:09 np0005466013 nova_compute[192144]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.130 2 DEBUG nova.scheduler.client.report [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Updated inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.131 2 DEBUG nova.compute.provider_tree [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Updating resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.131 2 DEBUG nova.compute.provider_tree [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.232 2 DEBUG nova.compute.provider_tree [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Updating resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.269 2 DEBUG nova.compute.resource_tracker [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.270 2 DEBUG oslo_concurrency.lockutils [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.270 2 DEBUG nova.service [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.354 2 DEBUG nova.service [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  2 07:54:10 np0005466013 nova_compute[192144]: 2025-10-02 11:54:10.354 2 DEBUG nova.servicegroup.drivers.db [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  2 07:54:10 np0005466013 systemd-logind[784]: New session 28 of user zuul.
Oct  2 07:54:10 np0005466013 systemd[1]: Started Session 28 of User zuul.
Oct  2 07:54:12 np0005466013 python3.9[192665]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:54:14 np0005466013 python3.9[192821]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:54:14 np0005466013 systemd[1]: Reloading.
Oct  2 07:54:14 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:54:14 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:54:15 np0005466013 python3.9[193006]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:54:15 np0005466013 network[193023]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:54:15 np0005466013 network[193024]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:54:15 np0005466013 network[193025]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:54:19 np0005466013 podman[193274]: 2025-10-02 11:54:19.016998908 +0000 UTC m=+0.066484968 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001, config_id=multipathd)
Oct  2 07:54:19 np0005466013 python3.9[193318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:54:20 np0005466013 python3.9[193475]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:20 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:54:21 np0005466013 python3.9[193628]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:22 np0005466013 python3.9[193780]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:54:23 np0005466013 python3.9[193932]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:54:24 np0005466013 python3.9[194084]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:54:24 np0005466013 systemd[1]: Reloading.
Oct  2 07:54:24 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:54:24 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:54:25 np0005466013 podman[194243]: 2025-10-02 11:54:25.062606932 +0000 UTC m=+0.057459288 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:54:25 np0005466013 python3.9[194289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:54:25 np0005466013 nova_compute[192144]: 2025-10-02 11:54:25.357 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:25 np0005466013 nova_compute[192144]: 2025-10-02 11:54:25.430 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:26 np0005466013 python3.9[194444]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:54:26 np0005466013 python3.9[194594]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:54:27 np0005466013 python3.9[194746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:28 np0005466013 python3.9[194867]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406067.1756132-366-1556915291504/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=d3d36c542f4af449a66988015465dd0bb4b47bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:54:29 np0005466013 python3.9[195019]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct  2 07:54:30 np0005466013 python3.9[195171]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct  2 07:54:30 np0005466013 python3.9[195324]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:54:32 np0005466013 python3.9[195482]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:54:35 np0005466013 python3.9[195640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:35 np0005466013 podman[195735]: 2025-10-02 11:54:35.763710598 +0000 UTC m=+0.084216434 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 07:54:35 np0005466013 python3.9[195774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759406074.9096692-570-138544012150695/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:36 np0005466013 python3.9[195937]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:36 np0005466013 auditd[701]: Audit daemon rotating log files
Oct  2 07:54:37 np0005466013 python3.9[196058]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759406076.0522816-570-135882280548926/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:37 np0005466013 python3.9[196208]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:38 np0005466013 python3.9[196329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759406077.2204468-570-243666592125550/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:38 np0005466013 podman[196453]: 2025-10-02 11:54:38.825719275 +0000 UTC m=+0.052753378 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, 
tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 07:54:38 np0005466013 python3.9[196496]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:54:39 np0005466013 python3.9[196650]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:54:40 np0005466013 python3.9[196802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:40 np0005466013 python3.9[196923]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406079.963687-747-138675358013410/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:41 np0005466013 python3.9[197073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:41 np0005466013 python3.9[197149]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:42 np0005466013 python3.9[197299]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:43 np0005466013 python3.9[197420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406082.132796-747-203788604677979/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:43 np0005466013 python3.9[197570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:44 np0005466013 python3.9[197691]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406083.265645-747-231920882151058/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:44 np0005466013 python3.9[197841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:45 np0005466013 python3.9[197962]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406084.404495-747-30740319243104/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:46 np0005466013 python3.9[198112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:46 np0005466013 python3.9[198233]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406085.7908103-747-136329890541724/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:47 np0005466013 python3.9[198383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:47 np0005466013 python3.9[198504]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406086.9445734-747-34232607371641/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:48 np0005466013 python3.9[198654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:49 np0005466013 python3.9[198775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406088.111371-747-145751571010843/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:49 np0005466013 podman[198776]: 2025-10-02 11:54:49.143184504 +0000 UTC m=+0.057980265 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 07:54:49 np0005466013 python3.9[198945]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:50 np0005466013 python3.9[199066]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406089.2097504-747-195618526313611/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:50 np0005466013 python3.9[199216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:51 np0005466013 python3.9[199337]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406090.3125618-747-62199735346811/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:51 np0005466013 python3.9[199487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:52 np0005466013 python3.9[199608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406091.5150979-747-202897939466533/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:54 np0005466013 python3.9[199758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:55 np0005466013 python3.9[199834]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:55 np0005466013 podman[199835]: 2025-10-02 11:54:55.167145699 +0000 UTC m=+0.062408627 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 07:54:55 np0005466013 python3.9[200004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:56 np0005466013 python3.9[200080]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:57 np0005466013 python3.9[200230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:57 np0005466013 python3.9[200306]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:58 np0005466013 python3.9[200458]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:59 np0005466013 python3.9[200610]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:59 np0005466013 python3.9[200762]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:00 np0005466013 python3.9[200914]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:00 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:00 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:00 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:01 np0005466013 systemd[1]: Listening on Podman API Socket.
Oct  2 07:55:01 np0005466013 python3.9[201105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:55:02.265 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:55:02.266 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:55:02.266 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:02 np0005466013 python3.9[201228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406101.4649522-1412-252213337707594/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:02 np0005466013 python3.9[201304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:02 np0005466013 nova_compute[192144]: 2025-10-02 11:55:02.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466013 nova_compute[192144]: 2025-10-02 11:55:02.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466013 nova_compute[192144]: 2025-10-02 11:55:02.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:55:02 np0005466013 nova_compute[192144]: 2025-10-02 11:55:02.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.011 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.011 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.012 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.012 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.012 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.014 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.014 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.014 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.015 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.041 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.041 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.042 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.042 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.192 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.193 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6148MB free_disk=73.66873168945312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.193 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.194 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.243 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.243 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.265 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.304 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.305 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:55:03 np0005466013 nova_compute[192144]: 2025-10-02 11:55:03.306 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:03 np0005466013 python3.9[201427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406101.4649522-1412-252213337707594/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:04 np0005466013 python3.9[201579]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct  2 07:55:05 np0005466013 python3.9[201731]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:06 np0005466013 podman[201855]: 2025-10-02 11:55:06.775583575 +0000 UTC m=+0.147058663 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  2 07:55:06 np0005466013 python3[201900]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:07 np0005466013 podman[201943]: 2025-10-02 11:55:07.112662254 +0000 UTC m=+0.027235033 image pull 5f0622bc7c13827171d93b3baf72157e23d24d44579ad79fe3a89ad88180a4bb quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  2 07:55:07 np0005466013 podman[201943]: 2025-10-02 11:55:07.212347721 +0000 UTC m=+0.126920480 container create e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 07:55:07 np0005466013 python3[201900]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Oct  2 07:55:08 np0005466013 python3.9[202131]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:09 np0005466013 podman[202257]: 2025-10-02 11:55:09.165456192 +0000 UTC m=+0.083575754 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 07:55:09 np0005466013 python3.9[202303]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:10 np0005466013 python3.9[202454]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406109.4608185-1604-30539814256897/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:10 np0005466013 python3.9[202530]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:10 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:11 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:11 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:11 np0005466013 python3.9[202642]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:11 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:11 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:11 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:12 np0005466013 systemd[1]: Starting ceilometer_agent_compute container...
Oct  2 07:55:12 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:12 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.
Oct  2 07:55:12 np0005466013 podman[202682]: 2025-10-02 11:55:12.296203976 +0000 UTC m=+0.108991926 container init e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + sudo -E kolla_set_configs
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:12 np0005466013 podman[202682]: 2025-10-02 11:55:12.333731216 +0000 UTC m=+0.146519136 container start e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 07:55:12 np0005466013 podman[202682]: ceilometer_agent_compute
Oct  2 07:55:12 np0005466013 systemd[1]: Started ceilometer_agent_compute container.
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Validating config file
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Copying service configuration files
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: INFO:__main__:Writing out command to execute
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: ++ cat /run_command
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + ARGS=
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + sudo kolla_copy_cacerts
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + [[ ! -n '' ]]
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + . kolla_extend_start
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + umask 0022
Oct  2 07:55:12 np0005466013 ceilometer_agent_compute[202697]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  2 07:55:12 np0005466013 podman[202704]: 2025-10-02 11:55:12.421175002 +0000 UTC m=+0.080061091 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:55:12 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-19b77ef594a7eff0.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:12 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-19b77ef594a7eff0.service: Failed with result 'exit-code'.
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.356 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.357 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.357 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.357 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.357 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.357 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.358 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.358 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.358 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.358 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.358 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.358 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.359 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.359 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.359 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.359 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.359 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.359 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.360 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.361 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.361 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.361 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.361 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.361 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.361 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.362 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.362 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.362 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.362 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.362 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.362 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.362 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.363 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.363 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.363 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.363 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.363 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.363 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.363 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.364 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.364 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.364 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.364 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.364 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.365 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.365 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.365 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.365 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.365 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.365 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.365 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.366 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.366 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.366 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.366 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.366 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.366 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.366 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.367 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.367 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.367 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.367 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.367 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.367 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.367 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.368 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.368 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.368 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.368 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.368 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.368 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.369 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.369 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.369 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.369 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.369 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.369 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.370 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.371 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.371 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.371 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.371 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.371 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.371 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.371 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.372 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.372 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.372 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.372 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.372 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.372 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.373 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.373 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.373 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.373 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.373 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.374 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.374 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.374 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.374 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.374 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.374 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.375 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.375 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.375 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.375 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.375 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.375 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.375 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.376 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.377 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.377 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.377 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.377 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.377 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.377 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.378 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.378 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.378 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.378 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.378 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.378 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.378 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.379 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.380 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.381 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.381 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.381 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.381 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.381 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.398 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.399 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.400 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.487 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.574 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.574 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.574 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.574 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.574 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.574 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.574 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.575 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.575 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.575 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.575 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.575 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.575 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.575 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.576 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.577 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.578 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.579 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.580 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.581 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.582 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.583 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.584 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.585 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.586 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.587 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.588 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.589 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.590 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.591 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.592 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.593 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.594 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.595 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.597 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.602 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:13 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:13.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466013 python3.9[202885]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:14 np0005466013 systemd[1]: Stopping ceilometer_agent_compute container...
Oct  2 07:55:14 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:14.327 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Oct  2 07:55:14 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:14.428 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Oct  2 07:55:14 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:14.428 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Oct  2 07:55:14 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:14.429 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Oct  2 07:55:14 np0005466013 ceilometer_agent_compute[202697]: 2025-10-02 11:55:14.437 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Oct  2 07:55:14 np0005466013 virtqemud[191867]: End of file while reading data: Input/output error
Oct  2 07:55:14 np0005466013 virtqemud[191867]: End of file while reading data: Input/output error
Oct  2 07:55:14 np0005466013 systemd[1]: libpod-e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.scope: Deactivated successfully.
Oct  2 07:55:14 np0005466013 podman[202889]: 2025-10-02 11:55:14.615824015 +0000 UTC m=+0.321739408 container stop e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Oct  2 07:55:14 np0005466013 systemd[1]: libpod-e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.scope: Consumed 1.474s CPU time.
Oct  2 07:55:14 np0005466013 podman[202889]: 2025-10-02 11:55:14.646065723 +0000 UTC m=+0.351981136 container died e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 07:55:14 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-19b77ef594a7eff0.timer: Deactivated successfully.
Oct  2 07:55:14 np0005466013 systemd[1]: Stopped /usr/bin/podman healthcheck run e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.
Oct  2 07:55:14 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:14 np0005466013 systemd[1]: var-lib-containers-storage-overlay-76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f-merged.mount: Deactivated successfully.
Oct  2 07:55:14 np0005466013 podman[202889]: 2025-10-02 11:55:14.761130881 +0000 UTC m=+0.467046274 container cleanup e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 07:55:14 np0005466013 podman[202889]: ceilometer_agent_compute
Oct  2 07:55:14 np0005466013 podman[202918]: ceilometer_agent_compute
Oct  2 07:55:14 np0005466013 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct  2 07:55:14 np0005466013 systemd[1]: Stopped ceilometer_agent_compute container.
Oct  2 07:55:14 np0005466013 systemd[1]: Starting ceilometer_agent_compute container...
Oct  2 07:55:15 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:15 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a804b6d0e9b8a2f807f1b43d168489573ef6df1e4cbe0171d264523b49706f/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.
Oct  2 07:55:15 np0005466013 podman[202931]: 2025-10-02 11:55:15.118270382 +0000 UTC m=+0.268872809 container init e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + sudo -E kolla_set_configs
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:15 np0005466013 podman[202931]: 2025-10-02 11:55:15.143787937 +0000 UTC m=+0.294390334 container start e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  2 07:55:15 np0005466013 podman[202931]: ceilometer_agent_compute
Oct  2 07:55:15 np0005466013 systemd[1]: Started ceilometer_agent_compute container.
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Validating config file
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Copying service configuration files
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: INFO:__main__:Writing out command to execute
Oct  2 07:55:15 np0005466013 podman[202953]: 2025-10-02 11:55:15.20394567 +0000 UTC m=+0.050324980 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS)
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: ++ cat /run_command
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + ARGS=
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + sudo kolla_copy_cacerts
Oct  2 07:55:15 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:15 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Failed with result 'exit-code'.
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + [[ ! -n '' ]]
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + . kolla_extend_start
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + umask 0022
Oct  2 07:55:15 np0005466013 ceilometer_agent_compute[202946]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.128 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.129 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.130 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.131 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.132 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.133 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.134 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.137 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.143 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.144 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.145 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.145 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.145 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.145 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.145 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.162 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.163 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.164 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.177 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.312 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.312 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.312 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.312 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.313 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.314 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.315 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.316 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.317 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.318 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.321 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.322 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.323 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.324 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.326 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.327 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.328 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.331 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.332 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.332 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.332 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.332 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.332 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.332 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.334 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.338 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:55:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466013 python3.9[203135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:17 np0005466013 python3.9[203258]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406116.2207396-1701-154098441314759/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:18 np0005466013 python3.9[203410]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct  2 07:55:19 np0005466013 python3.9[203562]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:19 np0005466013 podman[203587]: 2025-10-02 11:55:19.689783384 +0000 UTC m=+0.060677631 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 07:55:20 np0005466013 python3[203734]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:20 np0005466013 podman[203771]: 2025-10-02 11:55:20.626568738 +0000 UTC m=+0.046815058 container create bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Oct  2 07:55:20 np0005466013 podman[203771]: 2025-10-02 11:55:20.602732805 +0000 UTC m=+0.022979145 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  2 07:55:20 np0005466013 python3[203734]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct  2 07:55:22 np0005466013 python3.9[203961]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:22 np0005466013 python3.9[204115]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:23 np0005466013 python3.9[204266]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406123.0609272-1859-59800581568779/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:24 np0005466013 python3.9[204342]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:24 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:24 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:24 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:25 np0005466013 python3.9[204452]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:25 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:25 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:25 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:25 np0005466013 podman[204456]: 2025-10-02 11:55:25.306769216 +0000 UTC m=+0.095529335 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 07:55:25 np0005466013 systemd[1]: Starting node_exporter container...
Oct  2 07:55:25 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:25 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e312634c92438262c9256ec49373a6620fa751fd492c5b090c6bfa082ba12c/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:25 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e312634c92438262c9256ec49373a6620fa751fd492c5b090c6bfa082ba12c/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:25 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.
Oct  2 07:55:25 np0005466013 podman[204512]: 2025-10-02 11:55:25.694982229 +0000 UTC m=+0.124443880 container init bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.707Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.707Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.707Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=arp
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=bcache
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=bonding
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=cpu
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=edac
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.708Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=filefd
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=netclass
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=netdev
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=netstat
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=nfs
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=nvme
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=softnet
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=systemd
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=xfs
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=node_exporter.go:117 level=info collector=zfs
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.709Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  2 07:55:25 np0005466013 node_exporter[204527]: ts=2025-10-02T11:55:25.710Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  2 07:55:25 np0005466013 podman[204512]: 2025-10-02 11:55:25.725550196 +0000 UTC m=+0.155011827 container start bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 07:55:25 np0005466013 podman[204512]: node_exporter
Oct  2 07:55:25 np0005466013 systemd[1]: Started node_exporter container.
Oct  2 07:55:25 np0005466013 podman[204536]: 2025-10-02 11:55:25.793025674 +0000 UTC m=+0.056795917 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:55:27 np0005466013 python3.9[204712]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:27 np0005466013 systemd[1]: Stopping node_exporter container...
Oct  2 07:55:27 np0005466013 systemd[1]: libpod-bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.scope: Deactivated successfully.
Oct  2 07:55:27 np0005466013 podman[204716]: 2025-10-02 11:55:27.338408487 +0000 UTC m=+0.043971367 container died bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:55:27 np0005466013 systemd[1]: bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720-2a983eeb9c8bdbb6.timer: Deactivated successfully.
Oct  2 07:55:27 np0005466013 systemd[1]: Stopped /usr/bin/podman healthcheck run bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.
Oct  2 07:55:27 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:27 np0005466013 systemd[1]: var-lib-containers-storage-overlay-68e312634c92438262c9256ec49373a6620fa751fd492c5b090c6bfa082ba12c-merged.mount: Deactivated successfully.
Oct  2 07:55:27 np0005466013 podman[204716]: 2025-10-02 11:55:27.3794667 +0000 UTC m=+0.085029560 container cleanup bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:55:27 np0005466013 podman[204716]: node_exporter
Oct  2 07:55:27 np0005466013 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  2 07:55:27 np0005466013 podman[204744]: node_exporter
Oct  2 07:55:27 np0005466013 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct  2 07:55:27 np0005466013 systemd[1]: Stopped node_exporter container.
Oct  2 07:55:27 np0005466013 systemd[1]: Starting node_exporter container...
Oct  2 07:55:27 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:27 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e312634c92438262c9256ec49373a6620fa751fd492c5b090c6bfa082ba12c/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:27 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e312634c92438262c9256ec49373a6620fa751fd492c5b090c6bfa082ba12c/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:27 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.
Oct  2 07:55:27 np0005466013 podman[204757]: 2025-10-02 11:55:27.583143792 +0000 UTC m=+0.108618684 container init bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.594Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.594Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.594Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.594Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.594Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=arp
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=bcache
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=bonding
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=cpu
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=edac
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=filefd
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=netclass
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=netdev
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=netstat
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=nfs
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=nvme
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=softnet
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=systemd
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.595Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.596Z caller=node_exporter.go:117 level=info collector=xfs
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.596Z caller=node_exporter.go:117 level=info collector=zfs
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.596Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  2 07:55:27 np0005466013 node_exporter[204772]: ts=2025-10-02T11:55:27.597Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  2 07:55:27 np0005466013 podman[204757]: 2025-10-02 11:55:27.607928115 +0000 UTC m=+0.133402987 container start bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:55:27 np0005466013 podman[204757]: node_exporter
Oct  2 07:55:27 np0005466013 systemd[1]: Started node_exporter container.
Oct  2 07:55:27 np0005466013 podman[204781]: 2025-10-02 11:55:27.666557939 +0000 UTC m=+0.049964358 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:55:28 np0005466013 python3.9[204958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:28 np0005466013 python3.9[205081]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406127.8278468-1956-120289957187130/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:29 np0005466013 python3.9[205233]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct  2 07:55:30 np0005466013 python3.9[205385]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:31 np0005466013 python3[205537]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:32 np0005466013 podman[205550]: 2025-10-02 11:55:32.768301889 +0000 UTC m=+1.026611269 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  2 07:55:32 np0005466013 podman[205646]: 2025-10-02 11:55:32.900113772 +0000 UTC m=+0.046082661 container create 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Oct  2 07:55:32 np0005466013 podman[205646]: 2025-10-02 11:55:32.874994435 +0000 UTC m=+0.020963354 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  2 07:55:32 np0005466013 python3[205537]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct  2 07:55:33 np0005466013 python3.9[205836]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:34 np0005466013 python3.9[205990]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:35 np0005466013 python3.9[206141]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406134.8715403-2114-187509323487810/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:36 np0005466013 python3.9[206217]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:36 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:36 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:36 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:37 np0005466013 python3.9[206329]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:37 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:37 np0005466013 podman[206331]: 2025-10-02 11:55:37.164229587 +0000 UTC m=+0.104885499 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:55:37 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:37 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:37 np0005466013 systemd[1]: Starting podman_exporter container...
Oct  2 07:55:37 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:37 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564e166b504c11aed3a63ee0fc625713bcbfd0014174e20e587ba9deac36bd93/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:37 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564e166b504c11aed3a63ee0fc625713bcbfd0014174e20e587ba9deac36bd93/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:37 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.
Oct  2 07:55:37 np0005466013 podman[206395]: 2025-10-02 11:55:37.517901595 +0000 UTC m=+0.106032976 container init 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:55:37 np0005466013 podman_exporter[206411]: ts=2025-10-02T11:55:37.531Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  2 07:55:37 np0005466013 podman_exporter[206411]: ts=2025-10-02T11:55:37.531Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  2 07:55:37 np0005466013 podman_exporter[206411]: ts=2025-10-02T11:55:37.531Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  2 07:55:37 np0005466013 podman_exporter[206411]: ts=2025-10-02T11:55:37.531Z caller=handler.go:105 level=info collector=container
Oct  2 07:55:37 np0005466013 podman[206395]: 2025-10-02 11:55:37.543380532 +0000 UTC m=+0.131511903 container start 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:55:37 np0005466013 podman[206395]: podman_exporter
Oct  2 07:55:37 np0005466013 systemd[1]: Starting Podman API Service...
Oct  2 07:55:37 np0005466013 systemd[1]: Started Podman API Service.
Oct  2 07:55:37 np0005466013 systemd[1]: Started podman_exporter container.
Oct  2 07:55:37 np0005466013 podman[206422]: time="2025-10-02T11:55:37Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct  2 07:55:37 np0005466013 podman[206422]: time="2025-10-02T11:55:37Z" level=info msg="Setting parallel job count to 25"
Oct  2 07:55:37 np0005466013 podman[206422]: time="2025-10-02T11:55:37Z" level=info msg="Using sqlite as database backend"
Oct  2 07:55:37 np0005466013 podman[206422]: time="2025-10-02T11:55:37Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct  2 07:55:37 np0005466013 podman[206422]: time="2025-10-02T11:55:37Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct  2 07:55:37 np0005466013 podman[206422]: time="2025-10-02T11:55:37Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct  2 07:55:37 np0005466013 podman[206422]: @ - - [02/Oct/2025:11:55:37 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  2 07:55:37 np0005466013 podman[206422]: time="2025-10-02T11:55:37Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  2 07:55:37 np0005466013 podman[206421]: 2025-10-02 11:55:37.610803718 +0000 UTC m=+0.057665663 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:55:37 np0005466013 podman[206422]: @ - - [02/Oct/2025:11:55:37 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22059 "" "Go-http-client/1.1"
Oct  2 07:55:37 np0005466013 podman_exporter[206411]: ts=2025-10-02T11:55:37.616Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  2 07:55:37 np0005466013 podman_exporter[206411]: ts=2025-10-02T11:55:37.617Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  2 07:55:37 np0005466013 systemd[1]: 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58-4657a1b6a43c72ab.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:37 np0005466013 systemd[1]: 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58-4657a1b6a43c72ab.service: Failed with result 'exit-code'.
Oct  2 07:55:37 np0005466013 podman_exporter[206411]: ts=2025-10-02T11:55:37.617Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  2 07:55:38 np0005466013 python3.9[206609]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:38 np0005466013 systemd[1]: Stopping podman_exporter container...
Oct  2 07:55:38 np0005466013 podman[206422]: @ - - [02/Oct/2025:11:55:37 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct  2 07:55:38 np0005466013 systemd[1]: libpod-16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.scope: Deactivated successfully.
Oct  2 07:55:38 np0005466013 podman[206613]: 2025-10-02 11:55:38.499961182 +0000 UTC m=+0.050288866 container died 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 07:55:38 np0005466013 systemd[1]: 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58-4657a1b6a43c72ab.timer: Deactivated successfully.
Oct  2 07:55:38 np0005466013 systemd[1]: Stopped /usr/bin/podman healthcheck run 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.
Oct  2 07:55:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay-564e166b504c11aed3a63ee0fc625713bcbfd0014174e20e587ba9deac36bd93-merged.mount: Deactivated successfully.
Oct  2 07:55:38 np0005466013 podman[206613]: 2025-10-02 11:55:38.702386882 +0000 UTC m=+0.252714556 container cleanup 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 07:55:38 np0005466013 podman[206613]: podman_exporter
Oct  2 07:55:38 np0005466013 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  2 07:55:38 np0005466013 podman[206643]: podman_exporter
Oct  2 07:55:38 np0005466013 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct  2 07:55:38 np0005466013 systemd[1]: Stopped podman_exporter container.
Oct  2 07:55:38 np0005466013 systemd[1]: Starting podman_exporter container...
Oct  2 07:55:38 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:38 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564e166b504c11aed3a63ee0fc625713bcbfd0014174e20e587ba9deac36bd93/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:38 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564e166b504c11aed3a63ee0fc625713bcbfd0014174e20e587ba9deac36bd93/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:38 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.
Oct  2 07:55:38 np0005466013 podman[206656]: 2025-10-02 11:55:38.8975938 +0000 UTC m=+0.103438312 container init 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:55:38 np0005466013 podman_exporter[206671]: ts=2025-10-02T11:55:38.910Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  2 07:55:38 np0005466013 podman_exporter[206671]: ts=2025-10-02T11:55:38.910Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  2 07:55:38 np0005466013 podman_exporter[206671]: ts=2025-10-02T11:55:38.911Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  2 07:55:38 np0005466013 podman_exporter[206671]: ts=2025-10-02T11:55:38.911Z caller=handler.go:105 level=info collector=container
Oct  2 07:55:38 np0005466013 podman[206422]: @ - - [02/Oct/2025:11:55:38 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  2 07:55:38 np0005466013 podman[206422]: time="2025-10-02T11:55:38Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  2 07:55:38 np0005466013 podman[206656]: 2025-10-02 11:55:38.924792714 +0000 UTC m=+0.130637256 container start 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 07:55:38 np0005466013 podman[206656]: podman_exporter
Oct  2 07:55:38 np0005466013 systemd[1]: Started podman_exporter container.
Oct  2 07:55:38 np0005466013 podman[206422]: @ - - [02/Oct/2025:11:55:38 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22061 "" "Go-http-client/1.1"
Oct  2 07:55:38 np0005466013 podman_exporter[206671]: ts=2025-10-02T11:55:38.936Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  2 07:55:38 np0005466013 podman_exporter[206671]: ts=2025-10-02T11:55:38.937Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  2 07:55:38 np0005466013 podman_exporter[206671]: ts=2025-10-02T11:55:38.937Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  2 07:55:38 np0005466013 podman[206680]: 2025-10-02 11:55:38.989195022 +0000 UTC m=+0.055186393 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:55:39 np0005466013 podman[206799]: 2025-10-02 11:55:39.672894038 +0000 UTC m=+0.050557674 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:55:39 np0005466013 python3.9[206875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:40 np0005466013 python3.9[206998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406139.4978898-2210-34474115327849/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:41 np0005466013 python3.9[207150]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct  2 07:55:42 np0005466013 python3.9[207302]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:43 np0005466013 python3[207454]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:45 np0005466013 podman[207523]: 2025-10-02 11:55:45.529651318 +0000 UTC m=+0.170164876 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:55:45 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:45 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Failed with result 'exit-code'.
Oct  2 07:55:45 np0005466013 podman[207466]: 2025-10-02 11:55:45.71564546 +0000 UTC m=+2.544004207 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  2 07:55:45 np0005466013 podman[207579]: 2025-10-02 11:55:45.864070117 +0000 UTC m=+0.055094801 container create b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, name=ubi9-minimal)
Oct  2 07:55:45 np0005466013 podman[207579]: 2025-10-02 11:55:45.829541547 +0000 UTC m=+0.020566261 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  2 07:55:45 np0005466013 python3[207454]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z 
quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  2 07:55:46 np0005466013 python3.9[207769]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:47 np0005466013 python3.9[207923]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:48 np0005466013 python3.9[208074]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406147.5351176-2369-39977339538012/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:48 np0005466013 python3.9[208150]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:48 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:48 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:48 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:49 np0005466013 python3.9[208261]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:49 np0005466013 systemd[1]: Reloading.
Oct  2 07:55:49 np0005466013 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:49 np0005466013 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:49 np0005466013 systemd[1]: Starting openstack_network_exporter container...
Oct  2 07:55:49 np0005466013 podman[208299]: 2025-10-02 11:55:49.949746299 +0000 UTC m=+0.059993147 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Oct  2 07:55:49 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:49 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66c51b9c34232190264cfef934c7f7bc6ecf7687e11150466ed93fe1cf89ce3d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:49 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66c51b9c34232190264cfef934c7f7bc6ecf7687e11150466ed93fe1cf89ce3d/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:49 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66c51b9c34232190264cfef934c7f7bc6ecf7687e11150466ed93fe1cf89ce3d/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:49 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.
Oct  2 07:55:49 np0005466013 podman[208301]: 2025-10-02 11:55:49.995715626 +0000 UTC m=+0.104552619 container init b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., 
url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *bridge.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *coverage.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *datapath.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *iface.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *memory.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *ovnnorthd.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *ovn.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *ovsdbserver.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *pmd_perf.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *pmd_rxq.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: INFO    11:55:50 main.go:48: registering *vswitch.Collector
Oct  2 07:55:50 np0005466013 openstack_network_exporter[208332]: NOTICE  11:55:50 main.go:76: listening on https://:9105/metrics
Oct  2 07:55:50 np0005466013 podman[208301]: 2025-10-02 11:55:50.017922359 +0000 UTC m=+0.126759362 container start b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Oct  2 07:55:50 np0005466013 podman[208301]: openstack_network_exporter
Oct  2 07:55:50 np0005466013 systemd[1]: Started openstack_network_exporter container.
Oct  2 07:55:50 np0005466013 podman[208342]: 2025-10-02 11:55:50.1198072 +0000 UTC m=+0.091948822 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64)
Oct  2 07:55:50 np0005466013 python3.9[208517]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:50 np0005466013 systemd[1]: Stopping openstack_network_exporter container...
Oct  2 07:55:51 np0005466013 systemd[1]: libpod-b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.scope: Deactivated successfully.
Oct  2 07:55:51 np0005466013 podman[208521]: 2025-10-02 11:55:51.011783734 +0000 UTC m=+0.046236716 container died b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, architecture=x86_64, version=9.6)
Oct  2 07:55:51 np0005466013 systemd[1]: b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d-4934dd6d8ea2eed4.timer: Deactivated successfully.
Oct  2 07:55:51 np0005466013 systemd[1]: Stopped /usr/bin/podman healthcheck run b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.
Oct  2 07:55:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay-66c51b9c34232190264cfef934c7f7bc6ecf7687e11150466ed93fe1cf89ce3d-merged.mount: Deactivated successfully.
Oct  2 07:55:52 np0005466013 podman[208521]: 2025-10-02 11:55:52.22855309 +0000 UTC m=+1.263006072 container cleanup b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  2 07:55:52 np0005466013 podman[208521]: openstack_network_exporter
Oct  2 07:55:52 np0005466013 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  2 07:55:52 np0005466013 podman[208548]: openstack_network_exporter
Oct  2 07:55:52 np0005466013 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct  2 07:55:52 np0005466013 systemd[1]: Stopped openstack_network_exporter container.
Oct  2 07:55:52 np0005466013 systemd[1]: Starting openstack_network_exporter container...
Oct  2 07:55:52 np0005466013 systemd[1]: Started libcrun container.
Oct  2 07:55:52 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66c51b9c34232190264cfef934c7f7bc6ecf7687e11150466ed93fe1cf89ce3d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:52 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66c51b9c34232190264cfef934c7f7bc6ecf7687e11150466ed93fe1cf89ce3d/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:52 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66c51b9c34232190264cfef934c7f7bc6ecf7687e11150466ed93fe1cf89ce3d/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:52 np0005466013 systemd[1]: Started /usr/bin/podman healthcheck run b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.
Oct  2 07:55:52 np0005466013 podman[208561]: 2025-10-02 11:55:52.430755353 +0000 UTC m=+0.107865445 container init b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *bridge.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *coverage.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *datapath.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *iface.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *memory.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *ovnnorthd.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *ovn.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *ovsdbserver.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *pmd_perf.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *pmd_rxq.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: INFO    11:55:52 main.go:48: registering *vswitch.Collector
Oct  2 07:55:52 np0005466013 openstack_network_exporter[208577]: NOTICE  11:55:52 main.go:76: listening on https://:9105/metrics
Oct  2 07:55:52 np0005466013 podman[208561]: 2025-10-02 11:55:52.461336025 +0000 UTC m=+0.138446107 container start b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the 
minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6)
Oct  2 07:55:52 np0005466013 podman[208561]: openstack_network_exporter
Oct  2 07:55:52 np0005466013 systemd[1]: Started openstack_network_exporter container.
Oct  2 07:55:52 np0005466013 podman[208587]: 2025-10-02 11:55:52.522760527 +0000 UTC m=+0.054044866 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, release=1755695350)
Oct  2 07:55:53 np0005466013 python3.9[208761]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:55:55 np0005466013 podman[208786]: 2025-10-02 11:55:55.67732515 +0000 UTC m=+0.054193871 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 07:55:58 np0005466013 podman[208806]: 2025-10-02 11:55:58.668077523 +0000 UTC m=+0.050158852 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 07:56:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:56:02.266 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:56:02.267 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:56:02.267 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.298 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.313 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.314 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.314 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.314 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.314 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.314 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.337 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.338 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.338 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.338 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.450 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.451 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5944MB free_disk=73.50312805175781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.451 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.452 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.513 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.513 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.542 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.568 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.569 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:56:03 np0005466013 nova_compute[192144]: 2025-10-02 11:56:03.569 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:04 np0005466013 nova_compute[192144]: 2025-10-02 11:56:04.249 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466013 nova_compute[192144]: 2025-10-02 11:56:04.250 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:56:04 np0005466013 nova_compute[192144]: 2025-10-02 11:56:04.250 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:56:04 np0005466013 nova_compute[192144]: 2025-10-02 11:56:04.269 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:56:04 np0005466013 nova_compute[192144]: 2025-10-02 11:56:04.269 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466013 nova_compute[192144]: 2025-10-02 11:56:04.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466013 nova_compute[192144]: 2025-10-02 11:56:04.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:07 np0005466013 podman[208834]: 2025-10-02 11:56:07.735978694 +0000 UTC m=+0.108590249 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:56:09 np0005466013 podman[208861]: 2025-10-02 11:56:09.680244481 +0000 UTC m=+0.053466469 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:56:09 np0005466013 podman[208886]: 2025-10-02 11:56:09.758319118 +0000 UTC m=+0.047086924 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 07:56:16 np0005466013 podman[208909]: 2025-10-02 11:56:16.675992577 +0000 UTC m=+0.051107813 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:56:16 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:56:16 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Failed with result 'exit-code'.
Oct  2 07:56:20 np0005466013 podman[208930]: 2025-10-02 11:56:20.667606611 +0000 UTC m=+0.048239800 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  2 07:56:22 np0005466013 podman[208954]: 2025-10-02 11:56:22.669018033 +0000 UTC m=+0.050000537 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., 
version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct  2 07:56:26 np0005466013 podman[208976]: 2025-10-02 11:56:26.70087747 +0000 UTC m=+0.074279807 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 07:56:29 np0005466013 podman[208997]: 2025-10-02 11:56:29.667518878 +0000 UTC m=+0.049156039 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:56:38 np0005466013 podman[209021]: 2025-10-02 11:56:38.692607554 +0000 UTC m=+0.073080765 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 07:56:40 np0005466013 podman[209049]: 2025-10-02 11:56:40.676566693 +0000 UTC m=+0.053823335 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 07:56:40 np0005466013 podman[209048]: 2025-10-02 11:56:40.689178186 +0000 UTC m=+0.069742136 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:56:47 np0005466013 podman[209088]: 2025-10-02 11:56:47.666005552 +0000 UTC m=+0.047389603 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:56:47 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:56:47 np0005466013 systemd[1]: e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc-3f978aca15effb91.service: Failed with result 'exit-code'.
Oct  2 07:56:51 np0005466013 podman[209107]: 2025-10-02 11:56:51.69617134 +0000 UTC m=+0.077169669 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 07:56:52 np0005466013 podman[209254]: 2025-10-02 11:56:52.775890965 +0000 UTC m=+0.057465824 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Oct  2 07:56:52 np0005466013 python3.9[209255]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct  2 07:56:53 np0005466013 python3.9[209441]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:53 np0005466013 systemd[1]: Started libpod-conmon-ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486.scope.
Oct  2 07:56:53 np0005466013 podman[209442]: 2025-10-02 11:56:53.663791424 +0000 UTC m=+0.077965716 container exec ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 07:56:53 np0005466013 podman[209442]: 2025-10-02 11:56:53.694117008 +0000 UTC m=+0.108291280 container exec_died ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 07:56:53 np0005466013 systemd[1]: libpod-conmon-ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486.scope: Deactivated successfully.
Oct  2 07:56:54 np0005466013 python3.9[209626]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:54 np0005466013 systemd[1]: Started libpod-conmon-ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486.scope.
Oct  2 07:56:54 np0005466013 podman[209627]: 2025-10-02 11:56:54.415111029 +0000 UTC m=+0.063452070 container exec ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller)
Oct  2 07:56:54 np0005466013 podman[209627]: 2025-10-02 11:56:54.446209028 +0000 UTC m=+0.094550049 container exec_died ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible)
Oct  2 07:56:54 np0005466013 systemd[1]: libpod-conmon-ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486.scope: Deactivated successfully.
Oct  2 07:56:55 np0005466013 python3.9[209811]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:55 np0005466013 python3.9[209963]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct  2 07:56:56 np0005466013 python3.9[210127]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:56 np0005466013 systemd[1]: Started libpod-conmon-4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa.scope.
Oct  2 07:56:56 np0005466013 podman[210128]: 2025-10-02 11:56:56.521322673 +0000 UTC m=+0.064519515 container exec 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 07:56:56 np0005466013 podman[210128]: 2025-10-02 11:56:56.550939333 +0000 UTC m=+0.094136145 container exec_died 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:56:56 np0005466013 systemd[1]: libpod-conmon-4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa.scope: Deactivated successfully.
Oct  2 07:56:57 np0005466013 podman[210284]: 2025-10-02 11:56:57.040082488 +0000 UTC m=+0.069711104 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 07:56:57 np0005466013 python3.9[210331]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:57 np0005466013 systemd[1]: Started libpod-conmon-4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa.scope.
Oct  2 07:56:57 np0005466013 podman[210335]: 2025-10-02 11:56:57.304649117 +0000 UTC m=+0.068855007 container exec 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 07:56:57 np0005466013 podman[210335]: 2025-10-02 11:56:57.315306106 +0000 UTC m=+0.079511956 container exec_died 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:56:57 np0005466013 systemd[1]: libpod-conmon-4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa.scope: Deactivated successfully.
Oct  2 07:56:57 np0005466013 python3.9[210519]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:58 np0005466013 python3.9[210671]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct  2 07:56:59 np0005466013 python3.9[210837]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:59 np0005466013 systemd[1]: Started libpod-conmon-f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb.scope.
Oct  2 07:56:59 np0005466013 podman[210838]: 2025-10-02 11:56:59.382975687 +0000 UTC m=+0.078662798 container exec f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid)
Oct  2 07:56:59 np0005466013 podman[210857]: 2025-10-02 11:56:59.450119737 +0000 UTC m=+0.052737779 container exec_died f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:56:59 np0005466013 podman[210838]: 2025-10-02 11:56:59.456040091 +0000 UTC m=+0.151727202 container exec_died f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 07:56:59 np0005466013 systemd[1]: libpod-conmon-f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb.scope: Deactivated successfully.
Oct  2 07:56:59 np0005466013 podman[210993]: 2025-10-02 11:56:59.922974269 +0000 UTC m=+0.059263233 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:57:00 np0005466013 python3.9[211039]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:00 np0005466013 systemd[1]: Started libpod-conmon-f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb.scope.
Oct  2 07:57:00 np0005466013 podman[211043]: 2025-10-02 11:57:00.231098144 +0000 UTC m=+0.086149224 container exec f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct  2 07:57:00 np0005466013 podman[211043]: 2025-10-02 11:57:00.2651848 +0000 UTC m=+0.120235880 container exec_died f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 07:57:00 np0005466013 systemd[1]: libpod-conmon-f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb.scope: Deactivated successfully.
Oct  2 07:57:01 np0005466013 python3.9[211226]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:01 np0005466013 python3.9[211378]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct  2 07:57:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:57:02.267 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:57:02.268 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:57:02.268 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:02 np0005466013 python3.9[211543]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:02 np0005466013 systemd[1]: Started libpod-conmon-5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.scope.
Oct  2 07:57:02 np0005466013 podman[211544]: 2025-10-02 11:57:02.50924134 +0000 UTC m=+0.069439325 container exec 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 07:57:02 np0005466013 podman[211544]: 2025-10-02 11:57:02.539884715 +0000 UTC m=+0.100082660 container exec_died 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 07:57:02 np0005466013 systemd[1]: libpod-conmon-5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.scope: Deactivated successfully.
Oct  2 07:57:03 np0005466013 python3.9[211728]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:03 np0005466013 systemd[1]: Started libpod-conmon-5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.scope.
Oct  2 07:57:03 np0005466013 podman[211729]: 2025-10-02 11:57:03.237954345 +0000 UTC m=+0.075541016 container exec 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  2 07:57:03 np0005466013 podman[211748]: 2025-10-02 11:57:03.302106067 +0000 UTC m=+0.052203372 container exec_died 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:57:03 np0005466013 podman[211729]: 2025-10-02 11:57:03.314206933 +0000 UTC m=+0.151793614 container exec_died 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Oct  2 07:57:03 np0005466013 systemd[1]: libpod-conmon-5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6.scope: Deactivated successfully.
Oct  2 07:57:03 np0005466013 nova_compute[192144]: 2025-10-02 11:57:03.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:03 np0005466013 nova_compute[192144]: 2025-10-02 11:57:03.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:04 np0005466013 python3.9[211912]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.020 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.163 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.164 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5975MB free_disk=73.50279235839844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.164 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.165 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.223 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.223 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.253 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.267 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.268 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:57:04 np0005466013 nova_compute[192144]: 2025-10-02 11:57:04.268 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:04 np0005466013 python3.9[212064]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct  2 07:57:05 np0005466013 nova_compute[192144]: 2025-10-02 11:57:05.268 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:05 np0005466013 nova_compute[192144]: 2025-10-02 11:57:05.268 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:05 np0005466013 nova_compute[192144]: 2025-10-02 11:57:05.268 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:05 np0005466013 nova_compute[192144]: 2025-10-02 11:57:05.268 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:57:05 np0005466013 python3.9[212229]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:05 np0005466013 systemd[1]: Started libpod-conmon-e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.scope.
Oct  2 07:57:05 np0005466013 podman[212230]: 2025-10-02 11:57:05.598076198 +0000 UTC m=+0.097506555 container exec e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:57:05 np0005466013 podman[212230]: 2025-10-02 11:57:05.628113112 +0000 UTC m=+0.127543459 container exec_died e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:57:05 np0005466013 systemd[1]: libpod-conmon-e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.scope: Deactivated successfully.
Oct  2 07:57:05 np0005466013 nova_compute[192144]: 2025-10-02 11:57:05.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:05 np0005466013 nova_compute[192144]: 2025-10-02 11:57:05.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:57:05 np0005466013 nova_compute[192144]: 2025-10-02 11:57:05.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:57:06 np0005466013 nova_compute[192144]: 2025-10-02 11:57:06.012 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:57:06 np0005466013 nova_compute[192144]: 2025-10-02 11:57:06.012 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:06 np0005466013 python3.9[212413]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:06 np0005466013 systemd[1]: Started libpod-conmon-e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.scope.
Oct  2 07:57:06 np0005466013 podman[212414]: 2025-10-02 11:57:06.450246487 +0000 UTC m=+0.106033935 container exec e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:57:06 np0005466013 podman[212435]: 2025-10-02 11:57:06.560083346 +0000 UTC m=+0.093027920 container exec_died e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Oct  2 07:57:06 np0005466013 podman[212414]: 2025-10-02 11:57:06.635020581 +0000 UTC m=+0.290807809 container exec_died e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct  2 07:57:06 np0005466013 systemd[1]: libpod-conmon-e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc.scope: Deactivated successfully.
Oct  2 07:57:06 np0005466013 nova_compute[192144]: 2025-10-02 11:57:06.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:06 np0005466013 nova_compute[192144]: 2025-10-02 11:57:06.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:07 np0005466013 python3.9[212599]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:08 np0005466013 python3.9[212751]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct  2 07:57:08 np0005466013 python3.9[212916]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:08 np0005466013 systemd[1]: Started libpod-conmon-bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.scope.
Oct  2 07:57:08 np0005466013 podman[212917]: 2025-10-02 11:57:08.936980158 +0000 UTC m=+0.077170530 container exec bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:57:08 np0005466013 podman[212917]: 2025-10-02 11:57:08.971887281 +0000 UTC m=+0.112077653 container exec_died bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:57:09 np0005466013 systemd[1]: libpod-conmon-bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.scope: Deactivated successfully.
Oct  2 07:57:09 np0005466013 podman[212933]: 2025-10-02 11:57:09.094068464 +0000 UTC m=+0.152533308 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct  2 07:57:09 np0005466013 python3.9[213123]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:09 np0005466013 systemd[1]: Started libpod-conmon-bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.scope.
Oct  2 07:57:09 np0005466013 podman[213124]: 2025-10-02 11:57:09.826780539 +0000 UTC m=+0.095607483 container exec bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:57:09 np0005466013 podman[213143]: 2025-10-02 11:57:09.974115056 +0000 UTC m=+0.134812227 container exec_died bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:57:09 np0005466013 podman[213124]: 2025-10-02 11:57:09.999135996 +0000 UTC m=+0.267962920 container exec_died bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:57:10 np0005466013 systemd[1]: libpod-conmon-bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720.scope: Deactivated successfully.
Oct  2 07:57:10 np0005466013 python3.9[213307]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:11 np0005466013 podman[213432]: 2025-10-02 11:57:11.220905764 +0000 UTC m=+0.058412535 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 07:57:11 np0005466013 podman[213431]: 2025-10-02 11:57:11.222708633 +0000 UTC m=+0.065444705 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:57:11 np0005466013 python3.9[213493]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct  2 07:57:12 np0005466013 python3.9[213665]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:12 np0005466013 systemd[1]: Started libpod-conmon-16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.scope.
Oct  2 07:57:12 np0005466013 podman[213666]: 2025-10-02 11:57:12.4667112 +0000 UTC m=+0.254986745 container exec 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:57:12 np0005466013 podman[213685]: 2025-10-02 11:57:12.533217048 +0000 UTC m=+0.054312900 container exec_died 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:57:12 np0005466013 podman[213666]: 2025-10-02 11:57:12.543323579 +0000 UTC m=+0.331599114 container exec_died 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 07:57:12 np0005466013 systemd[1]: libpod-conmon-16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.scope: Deactivated successfully.
Oct  2 07:57:13 np0005466013 python3.9[213849]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:13 np0005466013 systemd[1]: Started libpod-conmon-16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.scope.
Oct  2 07:57:13 np0005466013 podman[213850]: 2025-10-02 11:57:13.337402625 +0000 UTC m=+0.070956835 container exec 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:57:13 np0005466013 podman[213870]: 2025-10-02 11:57:13.406025634 +0000 UTC m=+0.055869812 container exec_died 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:57:13 np0005466013 podman[213850]: 2025-10-02 11:57:13.436149411 +0000 UTC m=+0.169703591 container exec_died 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:57:13 np0005466013 systemd[1]: libpod-conmon-16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58.scope: Deactivated successfully.
Oct  2 07:57:14 np0005466013 python3.9[214034]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:14 np0005466013 python3.9[214186]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct  2 07:57:15 np0005466013 python3.9[214351]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:15 np0005466013 systemd[1]: Started libpod-conmon-b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.scope.
Oct  2 07:57:15 np0005466013 podman[214352]: 2025-10-02 11:57:15.691657366 +0000 UTC m=+0.072061012 container exec b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Oct  2 07:57:15 np0005466013 podman[214352]: 2025-10-02 11:57:15.700171035 +0000 UTC m=+0.080574681 container exec_died b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 07:57:15 np0005466013 systemd[1]: libpod-conmon-b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.scope: Deactivated successfully.
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:57:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466013 python3.9[214536]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:57:16 np0005466013 systemd[1]: Started libpod-conmon-b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.scope.
Oct  2 07:57:16 np0005466013 podman[214537]: 2025-10-02 11:57:16.519795468 +0000 UTC m=+0.098005472 container exec b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 07:57:16 np0005466013 podman[214537]: 2025-10-02 11:57:16.550222584 +0000 UTC m=+0.128432458 container exec_died b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Oct  2 07:57:16 np0005466013 systemd[1]: libpod-conmon-b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d.scope: Deactivated successfully.
Oct  2 07:57:17 np0005466013 python3.9[214720]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:17 np0005466013 podman[214844]: 2025-10-02 11:57:17.832394891 +0000 UTC m=+0.069515508 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:57:18 np0005466013 python3.9[214892]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:18 np0005466013 python3.9[215045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:19 np0005466013 python3.9[215168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406238.3112528-3312-175300982228091/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:20 np0005466013 python3.9[215320]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:21 np0005466013 python3.9[215472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:21 np0005466013 podman[215550]: 2025-10-02 11:57:21.831669907 +0000 UTC m=+0.057515465 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:57:22 np0005466013 python3.9[215551]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:22 np0005466013 python3.9[215722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:23 np0005466013 podman[215772]: 2025-10-02 11:57:23.246871572 +0000 UTC m=+0.056792052 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Oct  2 07:57:23 np0005466013 python3.9[215821]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.a5tcfqoq recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:24 np0005466013 python3.9[215974]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:24 np0005466013 python3.9[216052]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:25 np0005466013 python3.9[216204]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:26 np0005466013 python3[216357]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:57:27 np0005466013 podman[216481]: 2025-10-02 11:57:27.641693557 +0000 UTC m=+0.056265285 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 07:57:27 np0005466013 python3.9[216531]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:28 np0005466013 python3.9[216609]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:29 np0005466013 python3.9[216761]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:29 np0005466013 python3.9[216839]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:30 np0005466013 podman[216963]: 2025-10-02 11:57:30.474685901 +0000 UTC m=+0.053363918 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:57:30 np0005466013 python3.9[217015]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:31 np0005466013 python3.9[217093]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:32 np0005466013 python3.9[217245]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:32 np0005466013 python3.9[217323]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:33 np0005466013 python3.9[217475]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:34 np0005466013 python3.9[217600]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759406252.817422-3687-62172828141144/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:34 np0005466013 python3.9[217752]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:35 np0005466013 python3.9[217904]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:36 np0005466013 python3.9[218059]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:37 np0005466013 python3.9[218211]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:38 np0005466013 python3.9[218364]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:57:38 np0005466013 python3.9[218518]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:39 np0005466013 podman[218630]: 2025-10-02 11:57:39.769941861 +0000 UTC m=+0.144519412 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:57:39 np0005466013 python3.9[218696]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:40 np0005466013 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:57:40 np0005466013 systemd[1]: session-28.scope: Deactivated successfully.
Oct  2 07:57:40 np0005466013 systemd[1]: session-28.scope: Consumed 1min 38.196s CPU time.
Oct  2 07:57:40 np0005466013 systemd-logind[784]: Session 28 logged out. Waiting for processes to exit.
Oct  2 07:57:40 np0005466013 systemd-logind[784]: Removed session 28.
Oct  2 07:57:41 np0005466013 podman[218725]: 2025-10-02 11:57:41.672769304 +0000 UTC m=+0.051210067 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  2 07:57:41 np0005466013 podman[218724]: 2025-10-02 11:57:41.672917219 +0000 UTC m=+0.053446815 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:57:48 np0005466013 podman[218764]: 2025-10-02 11:57:48.670269726 +0000 UTC m=+0.051371962 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:57:52 np0005466013 podman[218784]: 2025-10-02 11:57:52.681076417 +0000 UTC m=+0.059097057 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:57:53 np0005466013 podman[218804]: 2025-10-02 11:57:53.68074656 +0000 UTC m=+0.055490668 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 07:57:58 np0005466013 podman[218827]: 2025-10-02 11:57:58.669742768 +0000 UTC m=+0.051360562 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:58:00 np0005466013 podman[218848]: 2025-10-02 11:58:00.677191639 +0000 UTC m=+0.054977571 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:58:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:58:02.268 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:58:02.268 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:58:02.269 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:04 np0005466013 nova_compute[192144]: 2025-10-02 11:58:04.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:04 np0005466013 nova_compute[192144]: 2025-10-02 11:58:04.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:58:04 np0005466013 nova_compute[192144]: 2025-10-02 11:58:04.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.018 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.019 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.173 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.174 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6019MB free_disk=73.50214767456055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.233 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.233 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.252 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.271 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.272 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:58:05 np0005466013 nova_compute[192144]: 2025-10-02 11:58:05.272 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.273 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.273 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.273 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.287 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.287 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466013 nova_compute[192144]: 2025-10-02 11:58:06.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:07 np0005466013 nova_compute[192144]: 2025-10-02 11:58:07.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:07 np0005466013 nova_compute[192144]: 2025-10-02 11:58:07.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:08 np0005466013 nova_compute[192144]: 2025-10-02 11:58:08.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:10 np0005466013 podman[218874]: 2025-10-02 11:58:10.75373059 +0000 UTC m=+0.122328438 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:58:12 np0005466013 podman[218900]: 2025-10-02 11:58:12.668712301 +0000 UTC m=+0.048337179 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:58:12 np0005466013 podman[218901]: 2025-10-02 11:58:12.674778346 +0000 UTC m=+0.049806564 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 07:58:19 np0005466013 podman[218943]: 2025-10-02 11:58:19.664649596 +0000 UTC m=+0.047438981 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:58:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:58:19.895 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:58:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:58:19.897 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 07:58:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:58:19.897 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:58:23 np0005466013 podman[218963]: 2025-10-02 11:58:23.673707973 +0000 UTC m=+0.052641251 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 07:58:24 np0005466013 podman[218983]: 2025-10-02 11:58:24.684982438 +0000 UTC m=+0.062445898 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat 
Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=)
Oct  2 07:58:29 np0005466013 podman[219005]: 2025-10-02 11:58:29.676193374 +0000 UTC m=+0.056560470 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid)
Oct  2 07:58:31 np0005466013 podman[219025]: 2025-10-02 11:58:31.666776113 +0000 UTC m=+0.042535554 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:58:41 np0005466013 podman[219051]: 2025-10-02 11:58:41.700587516 +0000 UTC m=+0.079550037 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:58:43 np0005466013 podman[219077]: 2025-10-02 11:58:43.665695086 +0000 UTC m=+0.046527736 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:58:43 np0005466013 podman[219078]: 2025-10-02 11:58:43.668551379 +0000 UTC m=+0.047152675 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct  2 07:58:50 np0005466013 podman[219118]: 2025-10-02 11:58:50.669918184 +0000 UTC m=+0.050940463 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  2 07:58:54 np0005466013 podman[219138]: 2025-10-02 11:58:54.673496267 +0000 UTC m=+0.053480717 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:58:55 np0005466013 podman[219159]: 2025-10-02 11:58:55.667603088 +0000 UTC m=+0.049009918 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 07:59:00 np0005466013 podman[219180]: 2025-10-02 11:59:00.684671273 +0000 UTC m=+0.066090561 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 07:59:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:59:02.272 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:59:02.272 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:59:02.272 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:02 np0005466013 podman[219202]: 2025-10-02 11:59:02.686107209 +0000 UTC m=+0.058841522 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:59:02 np0005466013 nova_compute[192144]: 2025-10-02 11:59:02.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:02 np0005466013 nova_compute[192144]: 2025-10-02 11:59:02.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 07:59:03 np0005466013 nova_compute[192144]: 2025-10-02 11:59:03.069 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 07:59:03 np0005466013 nova_compute[192144]: 2025-10-02 11:59:03.070 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:03 np0005466013 nova_compute[192144]: 2025-10-02 11:59:03.070 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 07:59:03 np0005466013 nova_compute[192144]: 2025-10-02 11:59:03.088 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.102 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.102 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.102 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.122 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.122 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.122 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.123 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.151 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.152 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.152 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.152 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.273 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.274 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6048MB free_disk=73.50225448608398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.274 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.274 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.345 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.345 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.364 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.384 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.385 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:59:07 np0005466013 nova_compute[192144]: 2025-10-02 11:59:07.385 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:08 np0005466013 nova_compute[192144]: 2025-10-02 11:59:08.258 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:08 np0005466013 nova_compute[192144]: 2025-10-02 11:59:08.258 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:08 np0005466013 nova_compute[192144]: 2025-10-02 11:59:08.991 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:08 np0005466013 nova_compute[192144]: 2025-10-02 11:59:08.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:08 np0005466013 nova_compute[192144]: 2025-10-02 11:59:08.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:10 np0005466013 nova_compute[192144]: 2025-10-02 11:59:10.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:12 np0005466013 podman[219226]: 2025-10-02 11:59:12.698713626 +0000 UTC m=+0.074376931 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Oct  2 07:59:14 np0005466013 podman[219252]: 2025-10-02 11:59:14.671694388 +0000 UTC m=+0.048985628 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:59:14 np0005466013 podman[219253]: 2025-10-02 11:59:14.706913546 +0000 UTC m=+0.078711136 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 11:59:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:21 np0005466013 podman[219293]: 2025-10-02 11:59:21.7037586 +0000 UTC m=+0.084305858 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:59:25 np0005466013 podman[219313]: 2025-10-02 11:59:25.671308972 +0000 UTC m=+0.050550092 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 07:59:25 np0005466013 podman[219333]: 2025-10-02 11:59:25.750719987 +0000 UTC m=+0.049117661 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, 
com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Oct  2 07:59:31 np0005466013 podman[219354]: 2025-10-02 11:59:31.666585949 +0000 UTC m=+0.044356694 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 07:59:33 np0005466013 podman[219374]: 2025-10-02 11:59:33.671723441 +0000 UTC m=+0.049614595 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:59:43 np0005466013 podman[219398]: 2025-10-02 11:59:43.702642328 +0000 UTC m=+0.082543286 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 07:59:45 np0005466013 podman[219427]: 2025-10-02 11:59:45.676945741 +0000 UTC m=+0.051413617 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 07:59:45 np0005466013 podman[219426]: 2025-10-02 11:59:45.706922067 +0000 UTC m=+0.081985441 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:59:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:59:51.472 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:59:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:59:51.473 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 07:59:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 11:59:51.474 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:59:52 np0005466013 podman[219467]: 2025-10-02 11:59:52.675785512 +0000 UTC m=+0.053649505 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  2 07:59:54 np0005466013 nova_compute[192144]: 2025-10-02 11:59:54.475 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "5c14c0c1-8950-4722-9065-b691fc2dc856" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:54 np0005466013 nova_compute[192144]: 2025-10-02 11:59:54.475 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "5c14c0c1-8950-4722-9065-b691fc2dc856" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:54 np0005466013 nova_compute[192144]: 2025-10-02 11:59:54.497 2 DEBUG nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 07:59:54 np0005466013 nova_compute[192144]: 2025-10-02 11:59:54.915 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:54 np0005466013 nova_compute[192144]: 2025-10-02 11:59:54.916 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:54 np0005466013 nova_compute[192144]: 2025-10-02 11:59:54.923 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 07:59:54 np0005466013 nova_compute[192144]: 2025-10-02 11:59:54.924 2 INFO nova.compute.claims [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.269 2 DEBUG nova.scheduler.client.report [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.376 2 DEBUG nova.scheduler.client.report [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.376 2 DEBUG nova.compute.provider_tree [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.397 2 DEBUG nova.scheduler.client.report [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.449 2 DEBUG nova.scheduler.client.report [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.528 2 DEBUG nova.compute.provider_tree [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.546 2 DEBUG nova.scheduler.client.report [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.582 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.583 2 DEBUG nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.905 2 DEBUG nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.939 2 INFO nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 07:59:55 np0005466013 nova_compute[192144]: 2025-10-02 11:59:55.967 2 DEBUG nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.124 2 DEBUG nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.126 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.126 2 INFO nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Creating image(s)#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.127 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "/var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.127 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "/var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.127 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "/var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.128 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:56 np0005466013 nova_compute[192144]: 2025-10-02 11:59:56.128 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:56 np0005466013 podman[219487]: 2025-10-02 11:59:56.670515702 +0000 UTC m=+0.051941927 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 07:59:56 np0005466013 podman[219488]: 2025-10-02 11:59:56.670555513 +0000 UTC m=+0.050394214 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9)
Oct  2 07:59:59 np0005466013 nova_compute[192144]: 2025-10-02 11:59:59.755 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:59:59 np0005466013 nova_compute[192144]: 2025-10-02 11:59:59.811 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:59:59 np0005466013 nova_compute[192144]: 2025-10-02 11:59:59.812 2 DEBUG nova.virt.images [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 07:59:59 np0005466013 nova_compute[192144]: 2025-10-02 11:59:59.820 2 DEBUG nova.privsep.utils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 07:59:59 np0005466013 nova_compute[192144]: 2025-10-02 11:59:59.820 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:01 np0005466013 nova_compute[192144]: 2025-10-02 12:00:01.536 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted" returned: 0 in 1.716s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:01 np0005466013 nova_compute[192144]: 2025-10-02 12:00:01.541 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:01 np0005466013 nova_compute[192144]: 2025-10-02 12:00:01.595 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:01 np0005466013 nova_compute[192144]: 2025-10-02 12:00:01.597 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 5.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:01 np0005466013 nova_compute[192144]: 2025-10-02 12:00:01.609 2 INFO oslo.privsep.daemon [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpk5w0oqf8/privsep.sock']#033[00m
Oct  2 08:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:02.272 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:02.273 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:02.273 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.333 2 INFO oslo.privsep.daemon [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.182 56 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.185 56 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.187 56 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.188 56 INFO oslo.privsep.daemon [-] privsep daemon running as pid 56#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.423 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.476 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.477 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.477 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.489 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.542 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.543 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.617 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk 1073741824" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.618 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.619 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.670 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.672 2 DEBUG nova.virt.disk.api [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Checking if we can resize image /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.672 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:02 np0005466013 podman[219553]: 2025-10-02 12:00:02.692890064 +0000 UTC m=+0.074176755 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.726 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.727 2 DEBUG nova.virt.disk.api [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Cannot resize image /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.727 2 DEBUG nova.objects.instance [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'migration_context' on Instance uuid 5c14c0c1-8950-4722-9065-b691fc2dc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.747 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.747 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Ensure instance console log exists: /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.748 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.748 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.748 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.751 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.755 2 WARNING nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.761 2 DEBUG nova.virt.libvirt.host [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.762 2 DEBUG nova.virt.libvirt.host [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.771 2 DEBUG nova.virt.libvirt.host [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.772 2 DEBUG nova.virt.libvirt.host [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.774 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.774 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.775 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.775 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.775 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.775 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.776 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.776 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.776 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.776 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.777 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.777 2 DEBUG nova.virt.hardware [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.783 2 DEBUG nova.privsep.utils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.786 2 DEBUG nova.objects.instance [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'pci_devices' on Instance uuid 5c14c0c1-8950-4722-9065-b691fc2dc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.807 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <uuid>5c14c0c1-8950-4722-9065-b691fc2dc856</uuid>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <name>instance-00000001</name>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <nova:name>tempest-AutoAllocateNetworkTest-server-28691690</nova:name>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:00:02</nova:creationTime>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:        <nova:user uuid="4e1cdf41d58b4774b94da988b9e8db73">tempest-AutoAllocateNetworkTest-1436985778-project-member</nova:user>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:        <nova:project uuid="23de7e9a877e477cb52ac4d4c1410e0d">tempest-AutoAllocateNetworkTest-1436985778</nova:project>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <entry name="serial">5c14c0c1-8950-4722-9065-b691fc2dc856</entry>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <entry name="uuid">5c14c0c1-8950-4722-9065-b691fc2dc856</entry>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk.config"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/console.log" append="off"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:00:02 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:00:02 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:00:02 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:00:02 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.863 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.863 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:02 np0005466013 nova_compute[192144]: 2025-10-02 12:00:02.864 2 INFO nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Using config drive#033[00m
Oct  2 08:00:03 np0005466013 nova_compute[192144]: 2025-10-02 12:00:03.640 2 INFO nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Creating config drive at /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk.config#033[00m
Oct  2 08:00:03 np0005466013 nova_compute[192144]: 2025-10-02 12:00:03.644 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp56_ecfqz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:03 np0005466013 nova_compute[192144]: 2025-10-02 12:00:03.772 2 DEBUG oslo_concurrency.processutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp56_ecfqz" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:03 np0005466013 systemd-machined[152202]: New machine qemu-1-instance-00000001.
Oct  2 08:00:03 np0005466013 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Oct  2 08:00:03 np0005466013 podman[219589]: 2025-10-02 12:00:03.901776466 +0000 UTC m=+0.056023630 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.756 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406404.7557507, 5c14c0c1-8950-4722-9065-b691fc2dc856 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.757 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.768 2 DEBUG nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.768 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.771 2 INFO nova.virt.libvirt.driver [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Instance spawned successfully.#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.772 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.817 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.820 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.834 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.835 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.836 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.836 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.836 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.837 2 DEBUG nova.virt.libvirt.driver [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.874 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.875 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406404.7681305, 5c14c0c1-8950-4722-9065-b691fc2dc856 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.875 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] VM Started (Lifecycle Event)#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.905 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.908 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.928 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.940 2 INFO nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Took 8.81 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:00:04 np0005466013 nova_compute[192144]: 2025-10-02 12:00:04.941 2 DEBUG nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:05 np0005466013 nova_compute[192144]: 2025-10-02 12:00:05.035 2 INFO nova.compute.manager [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Took 10.42 seconds to build instance.#033[00m
Oct  2 08:00:05 np0005466013 nova_compute[192144]: 2025-10-02 12:00:05.075 2 DEBUG oslo_concurrency.lockutils [None req-de761dbd-d94e-466e-a0f3-abfd520ef044 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "5c14c0c1-8950-4722-9065-b691fc2dc856" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:06 np0005466013 nova_compute[192144]: 2025-10-02 12:00:06.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.021 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.104 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.162 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.163 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.220 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.347 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.348 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5905MB free_disk=73.46766662597656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.349 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.349 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.459 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 5c14c0c1-8950-4722-9065-b691fc2dc856 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.459 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.460 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.504 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.583 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updated inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.583 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.584 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.616 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:00:07 np0005466013 nova_compute[192144]: 2025-10-02 12:00:07.617 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:08 np0005466013 nova_compute[192144]: 2025-10-02 12:00:08.618 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:08 np0005466013 nova_compute[192144]: 2025-10-02 12:00:08.619 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:00:08 np0005466013 nova_compute[192144]: 2025-10-02 12:00:08.620 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:00:08 np0005466013 nova_compute[192144]: 2025-10-02 12:00:08.971 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-5c14c0c1-8950-4722-9065-b691fc2dc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:08 np0005466013 nova_compute[192144]: 2025-10-02 12:00:08.971 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-5c14c0c1-8950-4722-9065-b691fc2dc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:08 np0005466013 nova_compute[192144]: 2025-10-02 12:00:08.972 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:00:08 np0005466013 nova_compute[192144]: 2025-10-02 12:00:08.972 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5c14c0c1-8950-4722-9065-b691fc2dc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.276 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.840 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.866 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-5c14c0c1-8950-4722-9065-b691fc2dc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.866 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.866 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.867 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.867 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.867 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:00:09 np0005466013 nova_compute[192144]: 2025-10-02 12:00:09.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:10 np0005466013 nova_compute[192144]: 2025-10-02 12:00:10.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:10 np0005466013 nova_compute[192144]: 2025-10-02 12:00:10.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.275 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.275 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.308 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.427 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.427 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.434 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.434 2 INFO nova.compute.claims [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.640 2 DEBUG nova.compute.provider_tree [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.819 2 DEBUG nova.scheduler.client.report [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.860 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.861 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.935 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.936 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:00:11 np0005466013 nova_compute[192144]: 2025-10-02 12:00:11.970 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.008 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.293 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.295 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.295 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Creating image(s)#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.295 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "/var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.296 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "/var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.296 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "/var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.309 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.374 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.375 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.376 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.389 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.451 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.452 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.497 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.498 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.498 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.548 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.549 2 DEBUG nova.virt.disk.api [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Checking if we can resize image /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.550 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.612 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.613 2 DEBUG nova.virt.disk.api [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Cannot resize image /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.613 2 DEBUG nova.objects.instance [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'migration_context' on Instance uuid 6e063d2c-d996-4af7-b02f-2ea76d2ba132 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.651 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.651 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Ensure instance console log exists: /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.652 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.652 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.652 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:12 np0005466013 nova_compute[192144]: 2025-10-02 12:00:12.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:13 np0005466013 nova_compute[192144]: 2025-10-02 12:00:13.012 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Automatically allocating a network for project 23de7e9a877e477cb52ac4d4c1410e0d. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Oct  2 08:00:13 np0005466013 nova_compute[192144]: 2025-10-02 12:00:13.021 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:14 np0005466013 podman[219652]: 2025-10-02 12:00:14.707125621 +0000 UTC m=+0.081330845 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:00:16 np0005466013 podman[219695]: 2025-10-02 12:00:16.671995076 +0000 UTC m=+0.045503697 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:00:16 np0005466013 podman[219696]: 2025-10-02 12:00:16.687781206 +0000 UTC m=+0.054581090 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:00:23 np0005466013 podman[219735]: 2025-10-02 12:00:23.684405379 +0000 UTC m=+0.063960041 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:00:27 np0005466013 podman[219758]: 2025-10-02 12:00:27.682577535 +0000 UTC m=+0.058834059 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 08:00:27 np0005466013 podman[219757]: 2025-10-02 12:00:27.682564894 +0000 UTC m=+0.061642467 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:00:33 np0005466013 podman[219799]: 2025-10-02 12:00:33.675337563 +0000 UTC m=+0.054146179 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:00:34 np0005466013 podman[219819]: 2025-10-02 12:00:34.668022786 +0000 UTC m=+0.049253052 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:00:37 np0005466013 nova_compute[192144]: 2025-10-02 12:00:37.091 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Automatically allocated network: {'id': '0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'name': 'auto_allocated_network', 'tenant_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['6a2058e4-dc89-48d3-88fc-bc95dba8da8b', 'd2e1858b-8344-4341-91f5-cb724ceffc0a'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-10-02T12:00:14Z', 'updated_at': '2025-10-02T12:00:26Z', 'revision_number': 4, 'project_id': '23de7e9a877e477cb52ac4d4c1410e0d'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Oct  2 08:00:37 np0005466013 nova_compute[192144]: 2025-10-02 12:00:37.102 2 WARNING oslo_policy.policy [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  2 08:00:37 np0005466013 nova_compute[192144]: 2025-10-02 12:00:37.103 2 WARNING oslo_policy.policy [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  2 08:00:37 np0005466013 nova_compute[192144]: 2025-10-02 12:00:37.105 2 DEBUG nova.policy [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4e1cdf41d58b4774b94da988b9e8db73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:00:39 np0005466013 nova_compute[192144]: 2025-10-02 12:00:39.806 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Successfully created port: 48210443-c8f5-4013-be00-7509a0d2c6b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.469 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Successfully updated port: 48210443-c8f5-4013-be00-7509a0d2c6b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.497 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "refresh_cache-6e063d2c-d996-4af7-b02f-2ea76d2ba132" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.498 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquired lock "refresh_cache-6e063d2c-d996-4af7-b02f-2ea76d2ba132" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.498 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.587 2 DEBUG nova.compute.manager [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received event network-changed-48210443-c8f5-4013-be00-7509a0d2c6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.588 2 DEBUG nova.compute.manager [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Refreshing instance network info cache due to event network-changed-48210443-c8f5-4013-be00-7509a0d2c6b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.588 2 DEBUG oslo_concurrency.lockutils [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-6e063d2c-d996-4af7-b02f-2ea76d2ba132" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:42 np0005466013 nova_compute[192144]: 2025-10-02 12:00:42.843 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.850 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Updating instance_info_cache with network_info: [{"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.949 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Releasing lock "refresh_cache-6e063d2c-d996-4af7-b02f-2ea76d2ba132" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.950 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Instance network_info: |[{"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.950 2 DEBUG oslo_concurrency.lockutils [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-6e063d2c-d996-4af7-b02f-2ea76d2ba132" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.950 2 DEBUG nova.network.neutron [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Refreshing network info cache for port 48210443-c8f5-4013-be00-7509a0d2c6b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.953 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Start _get_guest_xml network_info=[{"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.957 2 WARNING nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.965 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.965 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.969 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.970 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.971 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.971 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.971 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.972 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.972 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.972 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.972 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.973 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.973 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.973 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.973 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.973 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.977 2 DEBUG nova.virt.libvirt.vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-549213814-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-549213814-3',id=4,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='23de7e9a877e477cb52ac4d4c1410e0d',ramdisk_id='',reservation_id='r-2zs6ym1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1436985778',owner_user_name='tempest-AutoAllocateNetworkTest-1436985778-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:00:12Z,user_data=None,user_id='4e1cdf41d58b4774b94da988b9e8db73',uuid=6e063d2c-d996-4af7-b02f-2ea76d2ba132,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.977 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converting VIF {"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.978 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:99:53,bridge_name='br-int',has_traffic_filtering=True,id=48210443-c8f5-4013-be00-7509a0d2c6b7,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48210443-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:00:44 np0005466013 nova_compute[192144]: 2025-10-02 12:00:44.979 2 DEBUG nova.objects.instance [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e063d2c-d996-4af7-b02f-2ea76d2ba132 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.042 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <uuid>6e063d2c-d996-4af7-b02f-2ea76d2ba132</uuid>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <name>instance-00000004</name>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <nova:name>tempest-tempest.common.compute-instance-549213814-3</nova:name>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:00:44</nova:creationTime>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:user uuid="4e1cdf41d58b4774b94da988b9e8db73">tempest-AutoAllocateNetworkTest-1436985778-project-member</nova:user>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:project uuid="23de7e9a877e477cb52ac4d4c1410e0d">tempest-AutoAllocateNetworkTest-1436985778</nova:project>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        <nova:port uuid="48210443-c8f5-4013-be00-7509a0d2c6b7">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.1.0.50" ipVersion="4"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="fdfe:381f:8400::19e" ipVersion="6"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <entry name="serial">6e063d2c-d996-4af7-b02f-2ea76d2ba132</entry>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <entry name="uuid">6e063d2c-d996-4af7-b02f-2ea76d2ba132</entry>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk.config"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:04:99:53"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <target dev="tap48210443-c8"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/console.log" append="off"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:00:45 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:00:45 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:00:45 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:00:45 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.043 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Preparing to wait for external event network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.044 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.044 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.044 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.045 2 DEBUG nova.virt.libvirt.vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-549213814-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-549213814-3',id=4,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='23de7e9a877e477cb52ac4d4c1410e0d',ramdisk_id='',reservation_id='r-2zs6ym1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1436985778',owner_user_name='tempest-AutoAllocateNetworkTest-1436985778-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:00:12Z,user_data=None,user_id='4e1cdf41d58b4774b94da988b9e8db73',uuid=6e063d2c-d996-4af7-b02f-2ea76d2ba132,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.045 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converting VIF {"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.046 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:99:53,bridge_name='br-int',has_traffic_filtering=True,id=48210443-c8f5-4013-be00-7509a0d2c6b7,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48210443-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.046 2 DEBUG os_vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:99:53,bridge_name='br-int',has_traffic_filtering=True,id=48210443-c8f5-4013-be00-7509a0d2c6b7,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48210443-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.077 2 DEBUG ovsdbapp.backend.ovs_idl [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.077 2 DEBUG ovsdbapp.backend.ovs_idl [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.078 2 DEBUG ovsdbapp.backend.ovs_idl [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLOUT] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.093 2 INFO oslo.privsep.daemon [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpvabnqrvu/privsep.sock']#033[00m
Oct  2 08:00:45 np0005466013 podman[219848]: 2025-10-02 12:00:45.708786944 +0000 UTC m=+0.086037556 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller)
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.771 2 INFO oslo.privsep.daemon [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.621 101 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.625 101 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.627 101 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct  2 08:00:45 np0005466013 nova_compute[192144]: 2025-10-02 12:00:45.627 101 INFO oslo.privsep.daemon [-] privsep daemon running as pid 101#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.106 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48210443-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.106 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap48210443-c8, col_values=(('external_ids', {'iface-id': '48210443-c8f5-4013-be00-7509a0d2c6b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:99:53', 'vm-uuid': '6e063d2c-d996-4af7-b02f-2ea76d2ba132'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:46 np0005466013 NetworkManager[51205]: <info>  [1759406446.1094] manager: (tap48210443-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.116 2 INFO os_vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:99:53,bridge_name='br-int',has_traffic_filtering=True,id=48210443-c8f5-4013-be00-7509a0d2c6b7,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48210443-c8')#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.173 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.174 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.174 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No VIF found with MAC fa:16:3e:04:99:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.175 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Using config drive#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.587 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Creating config drive at /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk.config#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.592 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdhuflqss execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.715 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdhuflqss" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:46 np0005466013 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct  2 08:00:46 np0005466013 kernel: tap48210443-c8: entered promiscuous mode
Oct  2 08:00:46 np0005466013 NetworkManager[51205]: <info>  [1759406446.8071] manager: (tap48210443-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Oct  2 08:00:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:46Z|00027|binding|INFO|Claiming lport 48210443-c8f5-4013-be00-7509a0d2c6b7 for this chassis.
Oct  2 08:00:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:46Z|00028|binding|INFO|48210443-c8f5-4013-be00-7509a0d2c6b7: Claiming fa:16:3e:04:99:53 10.1.0.50 fdfe:381f:8400::19e
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:46 np0005466013 systemd-udevd[219940]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:00:46 np0005466013 systemd-machined[152202]: New machine qemu-2-instance-00000004.
Oct  2 08:00:46 np0005466013 podman[219891]: 2025-10-02 12:00:46.86109055 +0000 UTC m=+0.075304897 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 08:00:46 np0005466013 NetworkManager[51205]: <info>  [1759406446.8707] device (tap48210443-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:00:46 np0005466013 NetworkManager[51205]: <info>  [1759406446.8725] device (tap48210443-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:00:46 np0005466013 podman[219890]: 2025-10-02 12:00:46.872987891 +0000 UTC m=+0.091196990 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:00:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:46.883 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:99:53 10.1.0.50 fdfe:381f:8400::19e'], port_security=['fa:16:3e:04:99:53 10.1.0.50 fdfe:381f:8400::19e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.50/26 fdfe:381f:8400::19e/64', 'neutron:device_id': '6e063d2c-d996-4af7-b02f-2ea76d2ba132', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6166ab66-e763-4e6f-ba6d-1725486f45f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cab3463-7636-46ad-b75d-f72d7d1739eb, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=48210443-c8f5-4013-be00-7509a0d2c6b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:00:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:46.885 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 48210443-c8f5-4013-be00-7509a0d2c6b7 in datapath 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 bound to our chassis#033[00m
Oct  2 08:00:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:46.888 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6#033[00m
Oct  2 08:00:46 np0005466013 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Oct  2 08:00:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:46.889 103323 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpf5ymnpmk/privsep.sock']#033[00m
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:46Z|00029|binding|INFO|Setting lport 48210443-c8f5-4013-be00-7509a0d2c6b7 ovn-installed in OVS
Oct  2 08:00:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:46Z|00030|binding|INFO|Setting lport 48210443-c8f5-4013-be00-7509a0d2c6b7 up in Southbound
Oct  2 08:00:46 np0005466013 nova_compute[192144]: 2025-10-02 12:00:46.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.396 2 DEBUG nova.compute.manager [req-8492317a-442b-41b6-aa30-2cfecd0495ea req-3e4a7322-1c1c-4d25-bfdc-6e0f65611dc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received event network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.397 2 DEBUG oslo_concurrency.lockutils [req-8492317a-442b-41b6-aa30-2cfecd0495ea req-3e4a7322-1c1c-4d25-bfdc-6e0f65611dc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.397 2 DEBUG oslo_concurrency.lockutils [req-8492317a-442b-41b6-aa30-2cfecd0495ea req-3e4a7322-1c1c-4d25-bfdc-6e0f65611dc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.397 2 DEBUG oslo_concurrency.lockutils [req-8492317a-442b-41b6-aa30-2cfecd0495ea req-3e4a7322-1c1c-4d25-bfdc-6e0f65611dc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.397 2 DEBUG nova.compute.manager [req-8492317a-442b-41b6-aa30-2cfecd0495ea req-3e4a7322-1c1c-4d25-bfdc-6e0f65611dc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Processing event network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.407 2 DEBUG nova.network.neutron [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Updated VIF entry in instance network info cache for port 48210443-c8f5-4013-be00-7509a0d2c6b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.407 2 DEBUG nova.network.neutron [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Updating instance_info_cache with network_info: [{"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.436 2 DEBUG oslo_concurrency.lockutils [req-34c8cae2-3db7-466c-aebc-05b507e4c72b req-4318ee47-aea7-4137-b2bb-e1986489da5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-6e063d2c-d996-4af7-b02f-2ea76d2ba132" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:47.557 103323 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:47.557 103323 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpf5ymnpmk/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 08:00:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:47.432 219962 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:47.438 219962 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:47.440 219962 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct  2 08:00:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:47.440 219962 INFO oslo.privsep.daemon [-] privsep daemon running as pid 219962#033[00m
Oct  2 08:00:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:47.559 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4e124bf4-4d36-44c2-8090-e1af7020e155]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.908 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.909 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406447.909211, 6e063d2c-d996-4af7-b02f-2ea76d2ba132 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.909 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] VM Started (Lifecycle Event)#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.913 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.917 2 INFO nova.virt.libvirt.driver [-] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Instance spawned successfully.#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.917 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.930 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.934 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.941 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.941 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.941 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.942 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.942 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.942 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.964 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.965 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406447.9127746, 6e063d2c-d996-4af7-b02f-2ea76d2ba132 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:47 np0005466013 nova_compute[192144]: 2025-10-02 12:00:47.965 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.008 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.012 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406447.9140875, 6e063d2c-d996-4af7-b02f-2ea76d2ba132 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.013 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.026 2 INFO nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Took 35.73 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.027 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.031 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.039 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.049 219962 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.049 219962 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.049 219962 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.068 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.140 2 INFO nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Took 36.75 seconds to build instance.#033[00m
Oct  2 08:00:48 np0005466013 nova_compute[192144]: 2025-10-02 12:00:48.167 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 36.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.606 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0800fa4d-bb14-4ec5-8726-1e0496f958f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.607 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0e6cbdbf-b1 in ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.609 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0e6cbdbf-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.609 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[59ad8853-55e7-4b0b-8329-01d96a403731]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.612 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2129c65a-7c3d-4fea-8ac8-19256eca9444]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.642 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[9804eff5-a6bd-4aba-bb20-7c71b00a9006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.672 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac23e09-0021-4999-8ef3-d56d35241e62]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:48.674 103323 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpavke0sas/privsep.sock']#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.348 103323 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.350 103323 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpavke0sas/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.218 219977 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.227 219977 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.231 219977 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.232 219977 INFO oslo.privsep.daemon [-] privsep daemon running as pid 219977#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.353 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4f18d15c-ac9a-42ba-b834-b47ea429f540]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:49 np0005466013 nova_compute[192144]: 2025-10-02 12:00:49.567 2 DEBUG nova.compute.manager [req-99195cf3-99a2-4890-ba5e-72138d1ed3ff req-c2fe3751-5438-4132-b67a-a03cac16a5b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received event network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:49 np0005466013 nova_compute[192144]: 2025-10-02 12:00:49.568 2 DEBUG oslo_concurrency.lockutils [req-99195cf3-99a2-4890-ba5e-72138d1ed3ff req-c2fe3751-5438-4132-b67a-a03cac16a5b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:49 np0005466013 nova_compute[192144]: 2025-10-02 12:00:49.568 2 DEBUG oslo_concurrency.lockutils [req-99195cf3-99a2-4890-ba5e-72138d1ed3ff req-c2fe3751-5438-4132-b67a-a03cac16a5b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:49 np0005466013 nova_compute[192144]: 2025-10-02 12:00:49.568 2 DEBUG oslo_concurrency.lockutils [req-99195cf3-99a2-4890-ba5e-72138d1ed3ff req-c2fe3751-5438-4132-b67a-a03cac16a5b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:49 np0005466013 nova_compute[192144]: 2025-10-02 12:00:49.568 2 DEBUG nova.compute.manager [req-99195cf3-99a2-4890-ba5e-72138d1ed3ff req-c2fe3751-5438-4132-b67a-a03cac16a5b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] No waiting events found dispatching network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:00:49 np0005466013 nova_compute[192144]: 2025-10-02 12:00:49.568 2 WARNING nova.compute.manager [req-99195cf3-99a2-4890-ba5e-72138d1ed3ff req-c2fe3751-5438-4132-b67a-a03cac16a5b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received unexpected event network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.926 219977 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.926 219977 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:49.926 219977 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.560 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c996eece-6dc0-46b0-ab88-de3695eefe58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.566 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b2738596-5c44-4f44-82b5-711c9d5f22de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 NetworkManager[51205]: <info>  [1759406450.5703] manager: (tap0e6cbdbf-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.598 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[71da1a65-dd7f-469f-bfcc-2564bbbb2db0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 systemd-udevd[219987]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.601 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a35920b8-93b8-4b70-9a27-93fc848e17ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 NetworkManager[51205]: <info>  [1759406450.6278] device (tap0e6cbdbf-b0): carrier: link connected
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.633 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1546f9-629a-4682-8ceb-37d3b36426fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.656 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6eb155-7330-43d1-8f11-081fb8ffe3b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e6cbdbf-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:05:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444224, 'reachable_time': 23635, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220006, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.677 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a65d1b21-1a94-44eb-a1e4-cbc54b0e5130]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:520'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444224, 'tstamp': 444224}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220007, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.700 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9fdfe4c9-ded5-4002-ab0d-60fab77c86d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e6cbdbf-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:05:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444224, 'reachable_time': 23635, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220008, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.741 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[54e29128-0a09-48af-9f07-52cc56b47ff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.811 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3485dd37-58a3-4149-8333-a52443f6d4b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.812 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e6cbdbf-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.813 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.813 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e6cbdbf-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:50 np0005466013 NetworkManager[51205]: <info>  [1759406450.8160] manager: (tap0e6cbdbf-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct  2 08:00:50 np0005466013 kernel: tap0e6cbdbf-b0: entered promiscuous mode
Oct  2 08:00:50 np0005466013 nova_compute[192144]: 2025-10-02 12:00:50.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.820 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0e6cbdbf-b0, col_values=(('external_ids', {'iface-id': '0e5a8941-b399-4368-aa52-d99cb4bfefe5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:50Z|00031|binding|INFO|Releasing lport 0e5a8941-b399-4368-aa52-d99cb4bfefe5 from this chassis (sb_readonly=0)
Oct  2 08:00:50 np0005466013 nova_compute[192144]: 2025-10-02 12:00:50.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.834 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:00:50 np0005466013 nova_compute[192144]: 2025-10-02 12:00:50.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.835 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7eea861a-eb83-48ad-9248-c6161869fd5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.836 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.pid.haproxy
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:00:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:50.837 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'env', 'PROCESS_TAG=haproxy-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:00:51 np0005466013 nova_compute[192144]: 2025-10-02 12:00:51.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466013 podman[220041]: 2025-10-02 12:00:51.267245324 +0000 UTC m=+0.079744239 container create 242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:00:51 np0005466013 systemd[1]: Started libpod-conmon-242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2.scope.
Oct  2 08:00:51 np0005466013 podman[220041]: 2025-10-02 12:00:51.209627486 +0000 UTC m=+0.022126421 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:00:51 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:00:51 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ec20c3d26f742b8698f2584b55638a1984875dafc2ab97be96eb065574bbad6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:00:51 np0005466013 podman[220041]: 2025-10-02 12:00:51.352235736 +0000 UTC m=+0.164734671 container init 242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:00:51 np0005466013 podman[220041]: 2025-10-02 12:00:51.359245244 +0000 UTC m=+0.171744159 container start 242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:00:51 np0005466013 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[220056]: [NOTICE]   (220060) : New worker (220062) forked
Oct  2 08:00:51 np0005466013 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[220056]: [NOTICE]   (220060) : Loading success.
Oct  2 08:00:51 np0005466013 nova_compute[192144]: 2025-10-02 12:00:51.835 2 DEBUG nova.compute.manager [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  2 08:00:51 np0005466013 nova_compute[192144]: 2025-10-02 12:00:51.964 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:51 np0005466013 nova_compute[192144]: 2025-10-02 12:00:51.965 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:51 np0005466013 nova_compute[192144]: 2025-10-02 12:00:51.987 2 DEBUG nova.objects.instance [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.004 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.004 2 INFO nova.compute.claims [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.005 2 DEBUG nova.objects.instance [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lazy-loading 'resources' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.017 2 DEBUG nova.objects.instance [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.028 2 DEBUG nova.objects.instance [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.074 2 INFO nova.compute.resource_tracker [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updating resource usage from migration 230f0c54-f1fd-4e2c-8967-c050cc98b321#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.074 2 DEBUG nova.compute.resource_tracker [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Starting to track incoming migration 230f0c54-f1fd-4e2c-8967-c050cc98b321 with flavor 9ac83da7-f31e-4467-8569-d28002f6aeed _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.243 2 DEBUG nova.compute.provider_tree [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.257 2 DEBUG nova.scheduler.client.report [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.277 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.278 2 INFO nova.compute.manager [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Migrating#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.278 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.278 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.287 2 INFO nova.compute.rpcapi [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.287 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:52 np0005466013 nova_compute[192144]: 2025-10-02 12:00:52.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:54.416 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:00:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:54.418 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:00:54 np0005466013 nova_compute[192144]: 2025-10-02 12:00:54.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:54 np0005466013 podman[220071]: 2025-10-02 12:00:54.696168458 +0000 UTC m=+0.073258957 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:00:54 np0005466013 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:00:54 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:00:54 np0005466013 systemd-logind[784]: New session 29 of user nova.
Oct  2 08:00:54 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:00:54 np0005466013 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:00:54 np0005466013 systemd[220095]: Queued start job for default target Main User Target.
Oct  2 08:00:54 np0005466013 systemd[220095]: Created slice User Application Slice.
Oct  2 08:00:54 np0005466013 systemd[220095]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:00:54 np0005466013 systemd[220095]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:00:54 np0005466013 systemd[220095]: Reached target Paths.
Oct  2 08:00:54 np0005466013 systemd[220095]: Reached target Timers.
Oct  2 08:00:54 np0005466013 systemd[220095]: Starting D-Bus User Message Bus Socket...
Oct  2 08:00:54 np0005466013 systemd[220095]: Starting Create User's Volatile Files and Directories...
Oct  2 08:00:54 np0005466013 systemd[220095]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:00:54 np0005466013 systemd[220095]: Reached target Sockets.
Oct  2 08:00:54 np0005466013 systemd[220095]: Finished Create User's Volatile Files and Directories.
Oct  2 08:00:54 np0005466013 systemd[220095]: Reached target Basic System.
Oct  2 08:00:54 np0005466013 systemd[220095]: Reached target Main User Target.
Oct  2 08:00:54 np0005466013 systemd[220095]: Startup finished in 139ms.
Oct  2 08:00:54 np0005466013 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:00:54 np0005466013 systemd[1]: Started Session 29 of User nova.
Oct  2 08:00:54 np0005466013 systemd[1]: session-29.scope: Deactivated successfully.
Oct  2 08:00:54 np0005466013 systemd-logind[784]: Session 29 logged out. Waiting for processes to exit.
Oct  2 08:00:54 np0005466013 systemd-logind[784]: Removed session 29.
Oct  2 08:00:55 np0005466013 systemd-logind[784]: New session 31 of user nova.
Oct  2 08:00:55 np0005466013 systemd[1]: Started Session 31 of User nova.
Oct  2 08:00:55 np0005466013 systemd[1]: session-31.scope: Deactivated successfully.
Oct  2 08:00:55 np0005466013 systemd-logind[784]: Session 31 logged out. Waiting for processes to exit.
Oct  2 08:00:55 np0005466013 systemd-logind[784]: Removed session 31.
Oct  2 08:00:56 np0005466013 nova_compute[192144]: 2025-10-02 12:00:56.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.551 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.551 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.552 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.552 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.552 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.565 2 INFO nova.compute.manager [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Terminating instance#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.575 2 DEBUG nova.compute.manager [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:00:57 np0005466013 kernel: tap48210443-c8 (unregistering): left promiscuous mode
Oct  2 08:00:57 np0005466013 NetworkManager[51205]: <info>  [1759406457.6023] device (tap48210443-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:00:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:57Z|00032|binding|INFO|Releasing lport 48210443-c8f5-4013-be00-7509a0d2c6b7 from this chassis (sb_readonly=0)
Oct  2 08:00:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:57Z|00033|binding|INFO|Setting lport 48210443-c8f5-4013-be00-7509a0d2c6b7 down in Southbound
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:00:57Z|00034|binding|INFO|Removing iface tap48210443-c8 ovn-installed in OVS
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:57.661 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:99:53 10.1.0.50 fdfe:381f:8400::19e'], port_security=['fa:16:3e:04:99:53 10.1.0.50 fdfe:381f:8400::19e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.50/26 fdfe:381f:8400::19e/64', 'neutron:device_id': '6e063d2c-d996-4af7-b02f-2ea76d2ba132', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6166ab66-e763-4e6f-ba6d-1725486f45f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cab3463-7636-46ad-b75d-f72d7d1739eb, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=48210443-c8f5-4013-be00-7509a0d2c6b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:00:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:57.663 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 48210443-c8f5-4013-be00-7509a0d2c6b7 in datapath 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 unbound from our chassis#033[00m
Oct  2 08:00:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:57.665 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:00:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:57.666 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b43b5046-e88d-4e2a-8c82-d6cfda5b4e1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:57.666 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 namespace which is not needed anymore#033[00m
Oct  2 08:00:57 np0005466013 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Oct  2 08:00:57 np0005466013 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 10.628s CPU time.
Oct  2 08:00:57 np0005466013 systemd-machined[152202]: Machine qemu-2-instance-00000004 terminated.
Oct  2 08:00:57 np0005466013 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[220056]: [NOTICE]   (220060) : haproxy version is 2.8.14-c23fe91
Oct  2 08:00:57 np0005466013 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[220056]: [NOTICE]   (220060) : path to executable is /usr/sbin/haproxy
Oct  2 08:00:57 np0005466013 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[220056]: [WARNING]  (220060) : Exiting Master process...
Oct  2 08:00:57 np0005466013 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[220056]: [ALERT]    (220060) : Current worker (220062) exited with code 143 (Terminated)
Oct  2 08:00:57 np0005466013 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[220056]: [WARNING]  (220060) : All workers exited. Exiting... (0)
Oct  2 08:00:57 np0005466013 systemd[1]: libpod-242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2.scope: Deactivated successfully.
Oct  2 08:00:57 np0005466013 podman[220138]: 2025-10-02 12:00:57.814315016 +0000 UTC m=+0.079891067 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41)
Oct  2 08:00:57 np0005466013 podman[220128]: 2025-10-02 12:00:57.822088901 +0000 UTC m=+0.091565079 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:00:57 np0005466013 podman[220162]: 2025-10-02 12:00:57.827575186 +0000 UTC m=+0.062346120 container died 242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.832 2 INFO nova.virt.libvirt.driver [-] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Instance destroyed successfully.#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.833 2 DEBUG nova.objects.instance [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'resources' on Instance uuid 6e063d2c-d996-4af7-b02f-2ea76d2ba132 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.848 2 DEBUG nova.virt.libvirt.vif [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-549213814-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-549213814-3',id=4,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-10-02T12:00:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='23de7e9a877e477cb52ac4d4c1410e0d',ramdisk_id='',reservation_id='r-2zs6ym1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_projec
t_name='tempest-AutoAllocateNetworkTest-1436985778',owner_user_name='tempest-AutoAllocateNetworkTest-1436985778-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:00:48Z,user_data=None,user_id='4e1cdf41d58b4774b94da988b9e8db73',uuid=6e063d2c-d996-4af7-b02f-2ea76d2ba132,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.848 2 DEBUG nova.network.os_vif_util [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converting VIF {"id": "48210443-c8f5-4013-be00-7509a0d2c6b7", "address": "fa:16:3e:04:99:53", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::19e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48210443-c8", "ovs_interfaceid": "48210443-c8f5-4013-be00-7509a0d2c6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.849 2 DEBUG nova.network.os_vif_util [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:99:53,bridge_name='br-int',has_traffic_filtering=True,id=48210443-c8f5-4013-be00-7509a0d2c6b7,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48210443-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.850 2 DEBUG os_vif [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:99:53,bridge_name='br-int',has_traffic_filtering=True,id=48210443-c8f5-4013-be00-7509a0d2c6b7,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48210443-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.852 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48210443-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.856 2 INFO os_vif [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:99:53,bridge_name='br-int',has_traffic_filtering=True,id=48210443-c8f5-4013-be00-7509a0d2c6b7,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48210443-c8')#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.857 2 INFO nova.virt.libvirt.driver [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Deleting instance files /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132_del#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.858 2 INFO nova.virt.libvirt.driver [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Deletion of /var/lib/nova/instances/6e063d2c-d996-4af7-b02f-2ea76d2ba132_del complete#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.921 2 DEBUG nova.virt.libvirt.host [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.921 2 INFO nova.virt.libvirt.host [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] UEFI support detected#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.923 2 INFO nova.compute.manager [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.924 2 DEBUG oslo.service.loopingcall [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.924 2 DEBUG nova.compute.manager [-] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:00:57 np0005466013 nova_compute[192144]: 2025-10-02 12:00:57.924 2 DEBUG nova.network.neutron [-] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:00:57 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2-userdata-shm.mount: Deactivated successfully.
Oct  2 08:00:57 np0005466013 systemd[1]: var-lib-containers-storage-overlay-3ec20c3d26f742b8698f2584b55638a1984875dafc2ab97be96eb065574bbad6-merged.mount: Deactivated successfully.
Oct  2 08:00:57 np0005466013 podman[220162]: 2025-10-02 12:00:57.970885003 +0000 UTC m=+0.205655937 container cleanup 242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:00:57 np0005466013 systemd[1]: libpod-conmon-242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2.scope: Deactivated successfully.
Oct  2 08:00:58 np0005466013 podman[220225]: 2025-10-02 12:00:58.051740219 +0000 UTC m=+0.061594577 container remove 242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.056 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1af8f744-0da6-41b0-a12a-c8bbab29399f]: (4, ('Thu Oct  2 12:00:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 (242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2)\n242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2\nThu Oct  2 12:00:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 (242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2)\n242bfc1bb657992a2a5c76167264a886cc52441505b17aeda54a5ec7b6464cf2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.058 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef27f58-d3ef-49b4-9600-931cf2f610b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.059 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e6cbdbf-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:58 np0005466013 kernel: tap0e6cbdbf-b0: left promiscuous mode
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.064 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b9de6989-85ea-4293-95d8-913dd695bb74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.096 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fbba0a25-fbdb-4db9-93b4-3521dc4a64ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.097 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0d657f9d-828e-49ad-8bff-1ad8f8487d8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.110 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3586fc92-1d72-4e01-82ed-503fd1abab89]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444217, 'reachable_time': 37825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220240, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.119 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.120 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c58396-3145-4d86-a366-64b284e47c08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:58 np0005466013 systemd[1]: run-netns-ovnmeta\x2d0e6cbdbf\x2db727\x2d48dc\x2d82d1\x2df7af5e6b3fc6.mount: Deactivated successfully.
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.202 2 DEBUG nova.compute.manager [req-81526533-b4d3-432e-b80f-1d3d65b450db req-fa033c61-d86f-4217-b64b-47c3d459104a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received event network-vif-unplugged-48210443-c8f5-4013-be00-7509a0d2c6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.202 2 DEBUG oslo_concurrency.lockutils [req-81526533-b4d3-432e-b80f-1d3d65b450db req-fa033c61-d86f-4217-b64b-47c3d459104a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.202 2 DEBUG oslo_concurrency.lockutils [req-81526533-b4d3-432e-b80f-1d3d65b450db req-fa033c61-d86f-4217-b64b-47c3d459104a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.202 2 DEBUG oslo_concurrency.lockutils [req-81526533-b4d3-432e-b80f-1d3d65b450db req-fa033c61-d86f-4217-b64b-47c3d459104a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.203 2 DEBUG nova.compute.manager [req-81526533-b4d3-432e-b80f-1d3d65b450db req-fa033c61-d86f-4217-b64b-47c3d459104a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] No waiting events found dispatching network-vif-unplugged-48210443-c8f5-4013-be00-7509a0d2c6b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.203 2 DEBUG nova.compute.manager [req-81526533-b4d3-432e-b80f-1d3d65b450db req-fa033c61-d86f-4217-b64b-47c3d459104a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received event network-vif-unplugged-48210443-c8f5-4013-be00-7509a0d2c6b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:00:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:00:58.419 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.591 2 DEBUG nova.network.neutron [-] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.616 2 DEBUG nova.compute.manager [req-01a6910e-da25-4414-b255-b03cc2776322 req-95b92a1e-d707-4700-b767-22c4a372aa6c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received event network-vif-deleted-48210443-c8f5-4013-be00-7509a0d2c6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.617 2 INFO nova.compute.manager [req-01a6910e-da25-4414-b255-b03cc2776322 req-95b92a1e-d707-4700-b767-22c4a372aa6c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Neutron deleted interface 48210443-c8f5-4013-be00-7509a0d2c6b7; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.617 2 DEBUG nova.network.neutron [req-01a6910e-da25-4414-b255-b03cc2776322 req-95b92a1e-d707-4700-b767-22c4a372aa6c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.726 2 INFO nova.compute.manager [-] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Took 0.80 seconds to deallocate network for instance.#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.730 2 DEBUG nova.compute.manager [req-01a6910e-da25-4414-b255-b03cc2776322 req-95b92a1e-d707-4700-b767-22c4a372aa6c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Detach interface failed, port_id=48210443-c8f5-4013-be00-7509a0d2c6b7, reason: Instance 6e063d2c-d996-4af7-b02f-2ea76d2ba132 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.871 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.872 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.951 2 DEBUG nova.compute.provider_tree [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:58 np0005466013 nova_compute[192144]: 2025-10-02 12:00:58.996 2 DEBUG nova.scheduler.client.report [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:59 np0005466013 nova_compute[192144]: 2025-10-02 12:00:59.049 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:59 np0005466013 nova_compute[192144]: 2025-10-02 12:00:59.093 2 INFO nova.scheduler.client.report [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Deleted allocations for instance 6e063d2c-d996-4af7-b02f-2ea76d2ba132#033[00m
Oct  2 08:00:59 np0005466013 nova_compute[192144]: 2025-10-02 12:00:59.188 2 DEBUG oslo_concurrency.lockutils [None req-4d990e93-eb2e-401f-bc48-f2e17a76f36b 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:00 np0005466013 nova_compute[192144]: 2025-10-02 12:01:00.386 2 DEBUG nova.compute.manager [req-212118a4-6d92-4352-8105-b77e4b36f33d req-e5568ad2-299b-4ac7-9532-c40bb9bb1562 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received event network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:00 np0005466013 nova_compute[192144]: 2025-10-02 12:01:00.387 2 DEBUG oslo_concurrency.lockutils [req-212118a4-6d92-4352-8105-b77e4b36f33d req-e5568ad2-299b-4ac7-9532-c40bb9bb1562 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:00 np0005466013 nova_compute[192144]: 2025-10-02 12:01:00.387 2 DEBUG oslo_concurrency.lockutils [req-212118a4-6d92-4352-8105-b77e4b36f33d req-e5568ad2-299b-4ac7-9532-c40bb9bb1562 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:00 np0005466013 nova_compute[192144]: 2025-10-02 12:01:00.388 2 DEBUG oslo_concurrency.lockutils [req-212118a4-6d92-4352-8105-b77e4b36f33d req-e5568ad2-299b-4ac7-9532-c40bb9bb1562 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e063d2c-d996-4af7-b02f-2ea76d2ba132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:00 np0005466013 nova_compute[192144]: 2025-10-02 12:01:00.388 2 DEBUG nova.compute.manager [req-212118a4-6d92-4352-8105-b77e4b36f33d req-e5568ad2-299b-4ac7-9532-c40bb9bb1562 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] No waiting events found dispatching network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:01:00 np0005466013 nova_compute[192144]: 2025-10-02 12:01:00.388 2 WARNING nova.compute.manager [req-212118a4-6d92-4352-8105-b77e4b36f33d req-e5568ad2-299b-4ac7-9532-c40bb9bb1562 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Received unexpected event network-vif-plugged-48210443-c8f5-4013-be00-7509a0d2c6b7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:01:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:01:02.273 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:01:02.274 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:01:02.274 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:02 np0005466013 nova_compute[192144]: 2025-10-02 12:01:02.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:02 np0005466013 nova_compute[192144]: 2025-10-02 12:01:02.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:04 np0005466013 podman[220253]: 2025-10-02 12:01:04.68025612 +0000 UTC m=+0.057564365 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:01:04 np0005466013 podman[220273]: 2025-10-02 12:01:04.749877868 +0000 UTC m=+0.044969676 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:01:04 np0005466013 nova_compute[192144]: 2025-10-02 12:01:04.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:05 np0005466013 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:01:05 np0005466013 systemd[220095]: Activating special unit Exit the Session...
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped target Main User Target.
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped target Basic System.
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped target Paths.
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped target Sockets.
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped target Timers.
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:01:05 np0005466013 systemd[220095]: Closed D-Bus User Message Bus Socket.
Oct  2 08:01:05 np0005466013 systemd[220095]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:01:05 np0005466013 systemd[220095]: Removed slice User Application Slice.
Oct  2 08:01:05 np0005466013 systemd[220095]: Reached target Shutdown.
Oct  2 08:01:05 np0005466013 systemd[220095]: Finished Exit the Session.
Oct  2 08:01:05 np0005466013 systemd[220095]: Reached target Exit the Session.
Oct  2 08:01:05 np0005466013 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:01:05 np0005466013 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:01:05 np0005466013 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:01:05 np0005466013 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:01:05 np0005466013 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:01:05 np0005466013 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:01:05 np0005466013 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:01:07 np0005466013 nova_compute[192144]: 2025-10-02 12:01:07.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:07 np0005466013 nova_compute[192144]: 2025-10-02 12:01:07.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:07 np0005466013 nova_compute[192144]: 2025-10-02 12:01:07.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:08 np0005466013 systemd-logind[784]: New session 32 of user nova.
Oct  2 08:01:08 np0005466013 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:01:08 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:01:08 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:01:08 np0005466013 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:01:08 np0005466013 systemd[220303]: Queued start job for default target Main User Target.
Oct  2 08:01:08 np0005466013 systemd[220303]: Created slice User Application Slice.
Oct  2 08:01:08 np0005466013 systemd[220303]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:01:08 np0005466013 systemd[220303]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:01:08 np0005466013 systemd[220303]: Reached target Paths.
Oct  2 08:01:08 np0005466013 systemd[220303]: Reached target Timers.
Oct  2 08:01:08 np0005466013 systemd[220303]: Starting D-Bus User Message Bus Socket...
Oct  2 08:01:08 np0005466013 systemd[220303]: Starting Create User's Volatile Files and Directories...
Oct  2 08:01:08 np0005466013 systemd[220303]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:01:08 np0005466013 systemd[220303]: Reached target Sockets.
Oct  2 08:01:08 np0005466013 systemd[220303]: Finished Create User's Volatile Files and Directories.
Oct  2 08:01:08 np0005466013 systemd[220303]: Reached target Basic System.
Oct  2 08:01:08 np0005466013 systemd[220303]: Reached target Main User Target.
Oct  2 08:01:08 np0005466013 systemd[220303]: Startup finished in 134ms.
Oct  2 08:01:08 np0005466013 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:01:08 np0005466013 systemd[1]: Started Session 32 of User nova.
Oct  2 08:01:08 np0005466013 nova_compute[192144]: 2025-10-02 12:01:08.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:08 np0005466013 nova_compute[192144]: 2025-10-02 12:01:08.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:01:08 np0005466013 nova_compute[192144]: 2025-10-02 12:01:08.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.114 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.114 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.115 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.115 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:01:09 np0005466013 systemd[1]: session-32.scope: Deactivated successfully.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: Session 32 logged out. Waiting for processes to exit.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: Removed session 32.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: New session 34 of user nova.
Oct  2 08:01:09 np0005466013 systemd[1]: Started Session 34 of User nova.
Oct  2 08:01:09 np0005466013 systemd[1]: session-34.scope: Deactivated successfully.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: Session 34 logged out. Waiting for processes to exit.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: Removed session 34.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: New session 35 of user nova.
Oct  2 08:01:09 np0005466013 systemd[1]: Started Session 35 of User nova.
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.704 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:09 np0005466013 systemd[1]: session-35.scope: Deactivated successfully.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: Session 35 logged out. Waiting for processes to exit.
Oct  2 08:01:09 np0005466013 systemd-logind[784]: Removed session 35.
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.767 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.767 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.825 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.976 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.977 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5589MB free_disk=73.40771102905273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.977 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:09 np0005466013 nova_compute[192144]: 2025-10-02 12:01:09.977 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.508 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Migration for instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.613 2 INFO nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updating resource usage from migration 230f0c54-f1fd-4e2c-8967-c050cc98b321#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.613 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Starting to track incoming migration 230f0c54-f1fd-4e2c-8967-c050cc98b321 with flavor 9ac83da7-f31e-4467-8569-d28002f6aeed _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.647 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 5c14c0c1-8950-4722-9065-b691fc2dc856 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.764 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance with task_state "resize_migrated" is not being actively managed by this compute host but has allocations referencing this compute node (8a5c5335-95d5-48d7-aa6f-2fc6c798dc80): {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocations during the task state transition. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1708#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.764 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.765 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.819 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:10 np0005466013 nova_compute[192144]: 2025-10-02 12:01:10.922 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:01:11 np0005466013 nova_compute[192144]: 2025-10-02 12:01:11.287 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:01:11 np0005466013 nova_compute[192144]: 2025-10-02 12:01:11.287 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:01:11 np0005466013 nova_compute[192144]: 2025-10-02 12:01:11.861 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:01:11 np0005466013 nova_compute[192144]: 2025-10-02 12:01:11.861 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquired lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:01:11 np0005466013 nova_compute[192144]: 2025-10-02 12:01:11.861 2 DEBUG nova.network.neutron [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.288 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.288 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.289 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.289 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.578 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.578 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.660 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "5c14c0c1-8950-4722-9065-b691fc2dc856" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.661 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "5c14c0c1-8950-4722-9065-b691fc2dc856" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.661 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "5c14c0c1-8950-4722-9065-b691fc2dc856-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.661 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "5c14c0c1-8950-4722-9065-b691fc2dc856-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.662 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "5c14c0c1-8950-4722-9065-b691fc2dc856-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.830 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406457.8297062, 6e063d2c-d996-4af7-b02f-2ea76d2ba132 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.831 2 INFO nova.compute.manager [-] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] VM Stopped (Lifecycle Event)
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.878 2 INFO nova.compute.manager [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Terminating instance
Oct  2 08:01:12 np0005466013 nova_compute[192144]: 2025-10-02 12:01:12.906 2 DEBUG nova.compute.manager [None req-c8b5b252-5392-46cb-8706-15e2deb046fb - - - - - -] [instance: 6e063d2c-d996-4af7-b02f-2ea76d2ba132] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.031 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "refresh_cache-5c14c0c1-8950-4722-9065-b691fc2dc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.032 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquired lock "refresh_cache-5c14c0c1-8950-4722-9065-b691fc2dc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.032 2 DEBUG nova.network.neutron [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.166 2 DEBUG nova.network.neutron [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.434 2 DEBUG nova.network.neutron [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.560 2 DEBUG nova.network.neutron [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.642 2 DEBUG nova.network.neutron [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.656 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Releasing lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.660 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.660 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.660 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.790 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Releasing lock "refresh_cache-5c14c0c1-8950-4722-9065-b691fc2dc856" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:01:13 np0005466013 nova_compute[192144]: 2025-10-02 12:01:13.791 2 DEBUG nova.compute.manager [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:01:13 np0005466013 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Oct  2 08:01:13 np0005466013 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 14.722s CPU time.
Oct  2 08:01:13 np0005466013 systemd-machined[152202]: Machine qemu-1-instance-00000001 terminated.
Oct  2 08:01:14 np0005466013 nova_compute[192144]: 2025-10-02 12:01:14.030 2 INFO nova.virt.libvirt.driver [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Instance destroyed successfully.
Oct  2 08:01:14 np0005466013 nova_compute[192144]: 2025-10-02 12:01:14.031 2 DEBUG nova.objects.instance [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'resources' on Instance uuid 5c14c0c1-8950-4722-9065-b691fc2dc856 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:01:14 np0005466013 nova_compute[192144]: 2025-10-02 12:01:14.222 2 INFO nova.virt.libvirt.driver [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Deleting instance files /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856_del
Oct  2 08:01:14 np0005466013 nova_compute[192144]: 2025-10-02 12:01:14.223 2 INFO nova.virt.libvirt.driver [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Deletion of /var/lib/nova/instances/5c14c0c1-8950-4722-9065-b691fc2dc856_del complete
Oct  2 08:01:15 np0005466013 nova_compute[192144]: 2025-10-02 12:01:15.052 2 INFO nova.compute.manager [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Took 1.26 seconds to destroy the instance on the hypervisor.
Oct  2 08:01:15 np0005466013 nova_compute[192144]: 2025-10-02 12:01:15.053 2 DEBUG oslo.service.loopingcall [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:01:15 np0005466013 nova_compute[192144]: 2025-10-02 12:01:15.053 2 DEBUG nova.compute.manager [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:01:15 np0005466013 nova_compute[192144]: 2025-10-02 12:01:15.053 2 DEBUG nova.network.neutron [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.122 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.124 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.125 2 INFO nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Creating image(s)
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.126 2 DEBUG nova.objects.instance [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:01:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.416 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.485 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.486 2 DEBUG nova.virt.disk.api [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Checking if we can resize image /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.487 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.546 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:01:16 np0005466013 nova_compute[192144]: 2025-10-02 12:01:16.547 2 DEBUG nova.virt.disk.api [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Cannot resize image /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:01:16 np0005466013 podman[220353]: 2025-10-02 12:01:16.713029823 +0000 UTC m=+0.084103145 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller)
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.166 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.167 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Ensure instance console log exists: /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.167 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.168 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.168 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.170 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.175 2 WARNING nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.183 2 DEBUG nova.virt.libvirt.host [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.184 2 DEBUG nova.virt.libvirt.host [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.188 2 DEBUG nova.virt.libvirt.host [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.188 2 DEBUG nova.virt.libvirt.host [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.189 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.190 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.190 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.190 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.191 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.191 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.191 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.191 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.191 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.192 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.192 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.192 2 DEBUG nova.virt.hardware [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.192 2 DEBUG nova.objects.instance [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:17 np0005466013 podman[220379]: 2025-10-02 12:01:17.66658006 +0000 UTC m=+0.044014808 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:01:17 np0005466013 podman[220380]: 2025-10-02 12:01:17.671273891 +0000 UTC m=+0.045383829 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  2 08:01:17 np0005466013 nova_compute[192144]: 2025-10-02 12:01:17.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.024 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.074 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.074 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.075 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.076 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.077 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <uuid>0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7</uuid>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <name>instance-00000006</name>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <nova:name>tempest-MigrationsAdminTest-server-976277975</nova:name>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:01:17</nova:creationTime>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:        <nova:user uuid="8da35688aa864e189f10b334a21bc6c4">tempest-MigrationsAdminTest-1651504538-project-member</nova:user>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:        <nova:project uuid="4dcc6c51db2640cbb04083b3336de813">tempest-MigrationsAdminTest-1651504538</nova:project>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <entry name="serial">0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7</entry>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <entry name="uuid">0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7</entry>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/console.log" append="off"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:01:18 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:01:18 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:01:18 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:01:18 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.994 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.995 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:18 np0005466013 nova_compute[192144]: 2025-10-02 12:01:18.995 2 INFO nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Using config drive#033[00m
Oct  2 08:01:19 np0005466013 systemd-machined[152202]: New machine qemu-3-instance-00000006.
Oct  2 08:01:19 np0005466013 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Oct  2 08:01:19 np0005466013 nova_compute[192144]: 2025-10-02 12:01:19.674 2 DEBUG nova.network.neutron [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:19 np0005466013 nova_compute[192144]: 2025-10-02 12:01:19.677 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:19 np0005466013 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:01:19 np0005466013 systemd[220303]: Activating special unit Exit the Session...
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped target Main User Target.
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped target Basic System.
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped target Paths.
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped target Sockets.
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped target Timers.
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:01:19 np0005466013 systemd[220303]: Closed D-Bus User Message Bus Socket.
Oct  2 08:01:19 np0005466013 systemd[220303]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:01:19 np0005466013 systemd[220303]: Removed slice User Application Slice.
Oct  2 08:01:19 np0005466013 systemd[220303]: Reached target Shutdown.
Oct  2 08:01:19 np0005466013 systemd[220303]: Finished Exit the Session.
Oct  2 08:01:19 np0005466013 systemd[220303]: Reached target Exit the Session.
Oct  2 08:01:19 np0005466013 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:01:19 np0005466013 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:01:19 np0005466013 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:01:19 np0005466013 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:01:19 np0005466013 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:01:19 np0005466013 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:01:19 np0005466013 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:01:20 np0005466013 nova_compute[192144]: 2025-10-02 12:01:20.087 2 DEBUG nova.network.neutron [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:20 np0005466013 nova_compute[192144]: 2025-10-02 12:01:20.329 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:20 np0005466013 nova_compute[192144]: 2025-10-02 12:01:20.504 2 INFO nova.compute.manager [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Took 5.45 seconds to deallocate network for instance.#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.038 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.039 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.039 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.040 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.040 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.040 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.044 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406481.0441751, 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.044 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.047 2 DEBUG nova.compute.manager [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.050 2 INFO nova.virt.libvirt.driver [-] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance running successfully.#033[00m
Oct  2 08:01:21 np0005466013 virtqemud[191867]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.053 2 DEBUG nova.virt.libvirt.guest [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.053 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.395 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.399 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.735 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.736 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406481.0461514, 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.737 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] VM Started (Lifecycle Event)#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.857 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:21 np0005466013 nova_compute[192144]: 2025-10-02 12:01:21.861 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.105 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.106 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.243 2 DEBUG nova.compute.provider_tree [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.321 2 DEBUG nova.scheduler.client.report [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.447 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.504 2 INFO nova.scheduler.client.report [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Deleted allocations for instance 5c14c0c1-8950-4722-9065-b691fc2dc856#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.853 2 DEBUG oslo_concurrency.lockutils [None req-2cbccfae-9069-468b-a715-a4042dfcefe7 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "5c14c0c1-8950-4722-9065-b691fc2dc856" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:22 np0005466013 nova_compute[192144]: 2025-10-02 12:01:22.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:25 np0005466013 podman[220451]: 2025-10-02 12:01:25.671725404 +0000 UTC m=+0.051709738 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 08:01:27 np0005466013 nova_compute[192144]: 2025-10-02 12:01:27.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:27 np0005466013 nova_compute[192144]: 2025-10-02 12:01:27.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:28 np0005466013 podman[220472]: 2025-10-02 12:01:28.670929199 +0000 UTC m=+0.050438240 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Oct  2 08:01:28 np0005466013 podman[220471]: 2025-10-02 12:01:28.675717953 +0000 UTC m=+0.056802522 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  2 08:01:29 np0005466013 nova_compute[192144]: 2025-10-02 12:01:29.028 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406474.0276144, 5c14c0c1-8950-4722-9065-b691fc2dc856 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:29 np0005466013 nova_compute[192144]: 2025-10-02 12:01:29.029 2 INFO nova.compute.manager [-] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:01:29 np0005466013 nova_compute[192144]: 2025-10-02 12:01:29.117 2 DEBUG nova.compute.manager [None req-50215045-046a-4646-8ad9-e943cea383f9 - - - - - -] [instance: 5c14c0c1-8950-4722-9065-b691fc2dc856] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:29 np0005466013 nova_compute[192144]: 2025-10-02 12:01:29.448 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "e09de65a-0b2d-4aa5-9d9a-49f039add691" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:29 np0005466013 nova_compute[192144]: 2025-10-02 12:01:29.449 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "e09de65a-0b2d-4aa5-9d9a-49f039add691" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:29 np0005466013 nova_compute[192144]: 2025-10-02 12:01:29.521 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:01:30 np0005466013 nova_compute[192144]: 2025-10-02 12:01:30.390 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:30 np0005466013 nova_compute[192144]: 2025-10-02 12:01:30.391 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:30 np0005466013 nova_compute[192144]: 2025-10-02 12:01:30.397 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:01:30 np0005466013 nova_compute[192144]: 2025-10-02 12:01:30.397 2 INFO nova.compute.claims [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:01:30 np0005466013 nova_compute[192144]: 2025-10-02 12:01:30.915 2 DEBUG nova.compute.provider_tree [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:30 np0005466013 nova_compute[192144]: 2025-10-02 12:01:30.989 2 DEBUG nova.scheduler.client.report [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.063 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.064 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.343 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.344 2 DEBUG nova.network.neutron [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.525 2 INFO nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.623 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.634 2 DEBUG nova.network.neutron [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.634 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.972 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.974 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.974 2 INFO nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Creating image(s)#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.975 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.975 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.976 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:31 np0005466013 nova_compute[192144]: 2025-10-02 12:01:31.988 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.050 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.051 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.052 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.062 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.133 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.135 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.175 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.177 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.178 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.232 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.233 2 DEBUG nova.virt.disk.api [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Checking if we can resize image /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.233 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.294 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.295 2 DEBUG nova.virt.disk.api [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Cannot resize image /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.296 2 DEBUG nova.objects.instance [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'migration_context' on Instance uuid e09de65a-0b2d-4aa5-9d9a-49f039add691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.334 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.335 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Ensure instance console log exists: /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.335 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.335 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.336 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.337 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.340 2 WARNING nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.348 2 DEBUG nova.virt.libvirt.host [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.349 2 DEBUG nova.virt.libvirt.host [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.352 2 DEBUG nova.virt.libvirt.host [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.352 2 DEBUG nova.virt.libvirt.host [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.353 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.353 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.354 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.354 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.354 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.354 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.354 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.355 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.355 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.355 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.355 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.356 2 DEBUG nova.virt.hardware [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.358 2 DEBUG nova.objects.instance [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'pci_devices' on Instance uuid e09de65a-0b2d-4aa5-9d9a-49f039add691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.697 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <uuid>e09de65a-0b2d-4aa5-9d9a-49f039add691</uuid>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <name>instance-00000009</name>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <nova:name>tempest-MigrationsAdminTest-server-1510345576</nova:name>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:01:32</nova:creationTime>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:        <nova:user uuid="8da35688aa864e189f10b334a21bc6c4">tempest-MigrationsAdminTest-1651504538-project-member</nova:user>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:        <nova:project uuid="4dcc6c51db2640cbb04083b3336de813">tempest-MigrationsAdminTest-1651504538</nova:project>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <entry name="serial">e09de65a-0b2d-4aa5-9d9a-49f039add691</entry>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <entry name="uuid">e09de65a-0b2d-4aa5-9d9a-49f039add691</entry>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.config"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/console.log" append="off"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:01:32 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:01:32 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:01:32 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:01:32 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:01:32 np0005466013 nova_compute[192144]: 2025-10-02 12:01:32.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:33 np0005466013 nova_compute[192144]: 2025-10-02 12:01:33.079 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:33 np0005466013 nova_compute[192144]: 2025-10-02 12:01:33.080 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:33 np0005466013 nova_compute[192144]: 2025-10-02 12:01:33.081 2 INFO nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Using config drive#033[00m
Oct  2 08:01:33 np0005466013 nova_compute[192144]: 2025-10-02 12:01:33.377 2 INFO nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Creating config drive at /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.config#033[00m
Oct  2 08:01:33 np0005466013 nova_compute[192144]: 2025-10-02 12:01:33.384 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_zph9slm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:33 np0005466013 nova_compute[192144]: 2025-10-02 12:01:33.526 2 DEBUG oslo_concurrency.processutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_zph9slm" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:33 np0005466013 systemd-machined[152202]: New machine qemu-4-instance-00000009.
Oct  2 08:01:33 np0005466013 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.648 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406494.6481023, e09de65a-0b2d-4aa5-9d9a-49f039add691 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.649 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.651 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.652 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.655 2 INFO nova.virt.libvirt.driver [-] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance spawned successfully.#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.656 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.679 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.683 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.684 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.684 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.685 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.685 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.686 2 DEBUG nova.virt.libvirt.driver [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.689 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.720 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.721 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406494.6491375, e09de65a-0b2d-4aa5-9d9a-49f039add691 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.721 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] VM Started (Lifecycle Event)#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.752 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.756 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.784 2 INFO nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Took 2.81 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.785 2 DEBUG nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.793 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.871 2 INFO nova.compute.manager [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Took 5.11 seconds to build instance.#033[00m
Oct  2 08:01:34 np0005466013 nova_compute[192144]: 2025-10-02 12:01:34.898 2 DEBUG oslo_concurrency.lockutils [None req-436ae1a3-1705-4a44-a270-98662b2942c6 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "e09de65a-0b2d-4aa5-9d9a-49f039add691" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:35 np0005466013 podman[220565]: 2025-10-02 12:01:35.682894374 +0000 UTC m=+0.059888955 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:01:35 np0005466013 podman[220564]: 2025-10-02 12:01:35.684670508 +0000 UTC m=+0.062127583 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.183 2 DEBUG oslo_concurrency.lockutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "refresh_cache-e09de65a-0b2d-4aa5-9d9a-49f039add691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.184 2 DEBUG oslo_concurrency.lockutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquired lock "refresh_cache-e09de65a-0b2d-4aa5-9d9a-49f039add691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.184 2 DEBUG nova.network.neutron [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.366 2 DEBUG nova.network.neutron [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.600 2 DEBUG nova.network.neutron [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.618 2 DEBUG oslo_concurrency.lockutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Releasing lock "refresh_cache-e09de65a-0b2d-4aa5-9d9a-49f039add691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.739 2 DEBUG nova.virt.libvirt.driver [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.740 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Creating file /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/21c2ae2b0c364b478e44d49c56007f3c.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.740 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/21c2ae2b0c364b478e44d49c56007f3c.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:37 np0005466013 nova_compute[192144]: 2025-10-02 12:01:37.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:38 np0005466013 nova_compute[192144]: 2025-10-02 12:01:38.179 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/21c2ae2b0c364b478e44d49c56007f3c.tmp" returned: 1 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:38 np0005466013 nova_compute[192144]: 2025-10-02 12:01:38.180 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/21c2ae2b0c364b478e44d49c56007f3c.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:01:38 np0005466013 nova_compute[192144]: 2025-10-02 12:01:38.180 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Creating directory /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:01:38 np0005466013 nova_compute[192144]: 2025-10-02 12:01:38.180 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:38 np0005466013 nova_compute[192144]: 2025-10-02 12:01:38.392 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691" returned: 0 in 0.211s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:38 np0005466013 nova_compute[192144]: 2025-10-02 12:01:38.397 2 DEBUG nova.virt.libvirt.driver [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:01:42 np0005466013 nova_compute[192144]: 2025-10-02 12:01:42.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:42 np0005466013 nova_compute[192144]: 2025-10-02 12:01:42.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:47 np0005466013 nova_compute[192144]: 2025-10-02 12:01:47.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:47 np0005466013 podman[220625]: 2025-10-02 12:01:47.70149138 +0000 UTC m=+0.077501436 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:01:47 np0005466013 podman[220652]: 2025-10-02 12:01:47.771775117 +0000 UTC m=+0.045291825 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:01:47 np0005466013 podman[220653]: 2025-10-02 12:01:47.778641704 +0000 UTC m=+0.048633965 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  2 08:01:47 np0005466013 nova_compute[192144]: 2025-10-02 12:01:47.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:48 np0005466013 nova_compute[192144]: 2025-10-02 12:01:48.440 2 DEBUG nova.virt.libvirt.driver [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:01:50 np0005466013 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct  2 08:01:50 np0005466013 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 12.795s CPU time.
Oct  2 08:01:50 np0005466013 systemd-machined[152202]: Machine qemu-4-instance-00000009 terminated.
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.452 2 INFO nova.virt.libvirt.driver [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.459 2 INFO nova.virt.libvirt.driver [-] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance destroyed successfully.#033[00m
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.462 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.516 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.517 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.578 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.580 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Copying file /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk to 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:01:51 np0005466013 nova_compute[192144]: 2025-10-02 12:01:51.580 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.344 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "scp -r /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk" returned: 0 in 0.764s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.346 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Copying file /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.347 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk.config 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.630 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "scp -C -r /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk.config 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.config" returned: 0 in 0.284s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.631 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Copying file /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.631 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk.info 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.870 2 DEBUG oslo_concurrency.processutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "scp -C -r /var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691_resize/disk.info 192.168.122.100:/var/lib/nova/instances/e09de65a-0b2d-4aa5-9d9a-49f039add691/disk.info" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:52 np0005466013 nova_compute[192144]: 2025-10-02 12:01:52.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:53 np0005466013 nova_compute[192144]: 2025-10-02 12:01:53.017 2 DEBUG oslo_concurrency.lockutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "e09de65a-0b2d-4aa5-9d9a-49f039add691-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:53 np0005466013 nova_compute[192144]: 2025-10-02 12:01:53.017 2 DEBUG oslo_concurrency.lockutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "e09de65a-0b2d-4aa5-9d9a-49f039add691-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:53 np0005466013 nova_compute[192144]: 2025-10-02 12:01:53.017 2 DEBUG oslo_concurrency.lockutils [None req-7b85e7b1-cbe0-401b-96b7-2f36c2189cff 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "e09de65a-0b2d-4aa5-9d9a-49f039add691-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:01:54.725 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:01:54 np0005466013 nova_compute[192144]: 2025-10-02 12:01:54.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:01:54.727 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:01:56 np0005466013 nova_compute[192144]: 2025-10-02 12:01:56.022 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "e09de65a-0b2d-4aa5-9d9a-49f039add691" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:56 np0005466013 nova_compute[192144]: 2025-10-02 12:01:56.022 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "e09de65a-0b2d-4aa5-9d9a-49f039add691" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:56 np0005466013 nova_compute[192144]: 2025-10-02 12:01:56.023 2 DEBUG nova.compute.manager [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Going to confirm migration 3 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:01:56 np0005466013 nova_compute[192144]: 2025-10-02 12:01:56.053 2 DEBUG nova.objects.instance [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'info_cache' on Instance uuid e09de65a-0b2d-4aa5-9d9a-49f039add691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:56 np0005466013 ovn_controller[94366]: 2025-10-02T12:01:56Z|00035|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:01:56 np0005466013 podman[220718]: 2025-10-02 12:01:56.685962391 +0000 UTC m=+0.057994899 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:01:56 np0005466013 nova_compute[192144]: 2025-10-02 12:01:56.946 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "refresh_cache-e09de65a-0b2d-4aa5-9d9a-49f039add691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:56 np0005466013 nova_compute[192144]: 2025-10-02 12:01:56.947 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquired lock "refresh_cache-e09de65a-0b2d-4aa5-9d9a-49f039add691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:56 np0005466013 nova_compute[192144]: 2025-10-02 12:01:56.947 2 DEBUG nova.network.neutron [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.073 2 DEBUG nova.network.neutron [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.457 2 DEBUG nova.network.neutron [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.476 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Releasing lock "refresh_cache-e09de65a-0b2d-4aa5-9d9a-49f039add691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.477 2 DEBUG nova.objects.instance [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'migration_context' on Instance uuid e09de65a-0b2d-4aa5-9d9a-49f039add691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.513 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.513 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.597 2 DEBUG nova.compute.provider_tree [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.616 2 DEBUG nova.scheduler.client.report [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.660 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.784 2 INFO nova.scheduler.client.report [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Deleted allocation for migration df4af920-0d69-485c-a689-db5f063c5cf4#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.843 2 DEBUG oslo_concurrency.lockutils [None req-b948f72f-ea8d-4b00-9c37-6352b841447d 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "e09de65a-0b2d-4aa5-9d9a-49f039add691" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 1.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:57 np0005466013 nova_compute[192144]: 2025-10-02 12:01:57.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:01:58.729 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:59 np0005466013 podman[220740]: 2025-10-02 12:01:59.667659167 +0000 UTC m=+0.046076586 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc.)
Oct  2 08:01:59 np0005466013 podman[220739]: 2025-10-02 12:01:59.674641083 +0000 UTC m=+0.053635270 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd)
Oct  2 08:02:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:02.275 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:02.276 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:02.276 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:02 np0005466013 nova_compute[192144]: 2025-10-02 12:02:02.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:02 np0005466013 nova_compute[192144]: 2025-10-02 12:02:02.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:06 np0005466013 nova_compute[192144]: 2025-10-02 12:02:06.031 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406511.0297303, e09de65a-0b2d-4aa5-9d9a-49f039add691 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:06 np0005466013 nova_compute[192144]: 2025-10-02 12:02:06.031 2 INFO nova.compute.manager [-] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:02:06 np0005466013 nova_compute[192144]: 2025-10-02 12:02:06.048 2 DEBUG nova.compute.manager [None req-032246d6-9132-41ba-90bd-7a2180743e71 - - - - - -] [instance: e09de65a-0b2d-4aa5-9d9a-49f039add691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:06 np0005466013 podman[220782]: 2025-10-02 12:02:06.673204393 +0000 UTC m=+0.050297087 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:02:06 np0005466013 podman[220783]: 2025-10-02 12:02:06.710140406 +0000 UTC m=+0.086062783 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:02:07 np0005466013 nova_compute[192144]: 2025-10-02 12:02:07.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:07 np0005466013 nova_compute[192144]: 2025-10-02 12:02:07.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:08 np0005466013 nova_compute[192144]: 2025-10-02 12:02:08.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:08 np0005466013 nova_compute[192144]: 2025-10-02 12:02:08.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:02:09 np0005466013 nova_compute[192144]: 2025-10-02 12:02:09.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:09 np0005466013 nova_compute[192144]: 2025-10-02 12:02:09.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.042 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.043 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.043 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.043 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.132 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.185 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.185 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.239 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.381 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.383 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5637MB free_disk=73.43970108032227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.383 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.383 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.487 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.488 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.488 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.545 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.571 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.602 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:02:10 np0005466013 nova_compute[192144]: 2025-10-02 12:02:10.603 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:11 np0005466013 nova_compute[192144]: 2025-10-02 12:02:11.603 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:11 np0005466013 nova_compute[192144]: 2025-10-02 12:02:11.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:11 np0005466013 nova_compute[192144]: 2025-10-02 12:02:11.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:02:11 np0005466013 nova_compute[192144]: 2025-10-02 12:02:11.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.225 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.225 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.274 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.356 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.357 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.366 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.367 2 INFO nova.compute.claims [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.799 2 DEBUG nova.compute.provider_tree [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.818 2 DEBUG nova.scheduler.client.report [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.841 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.842 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.967 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:02:12 np0005466013 nova_compute[192144]: 2025-10-02 12:02:12.968 2 DEBUG nova.network.neutron [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.011 2 INFO nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.059 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.168 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.168 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.169 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.169 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.249 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.250 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.250 2 INFO nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Creating image(s)#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.250 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "/var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.251 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "/var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.251 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "/var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.266 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.319 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.320 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.321 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.331 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.382 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.383 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.409 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.422 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.423 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.423 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.478 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.479 2 DEBUG nova.virt.disk.api [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Checking if we can resize image /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.479 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.497 2 DEBUG nova.policy [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.537 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.537 2 DEBUG nova.virt.disk.api [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Cannot resize image /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.538 2 DEBUG nova.objects.instance [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lazy-loading 'migration_context' on Instance uuid f1267fe1-552c-4312-b9b0-c02eae82a77a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.560 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.560 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Ensure instance console log exists: /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.560 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.561 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.561 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.763 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.780 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.780 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.781 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.781 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:13 np0005466013 nova_compute[192144]: 2025-10-02 12:02:13.781 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:14 np0005466013 nova_compute[192144]: 2025-10-02 12:02:14.776 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:14 np0005466013 nova_compute[192144]: 2025-10-02 12:02:14.777 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:02:15 np0005466013 nova_compute[192144]: 2025-10-02 12:02:15.736 2 DEBUG nova.network.neutron [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Successfully updated port: 75561bb8-bfb9-4100-9c79-271fd50011de _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:02:15 np0005466013 nova_compute[192144]: 2025-10-02 12:02:15.832 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:02:15 np0005466013 nova_compute[192144]: 2025-10-02 12:02:15.833 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquired lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:02:15 np0005466013 nova_compute[192144]: 2025-10-02 12:02:15.834 2 DEBUG nova.network.neutron [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:02:15 np0005466013 nova_compute[192144]: 2025-10-02 12:02:15.844 2 DEBUG nova.compute.manager [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-changed-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:15 np0005466013 nova_compute[192144]: 2025-10-02 12:02:15.845 2 DEBUG nova.compute.manager [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Refreshing instance network info cache due to event network-changed-75561bb8-bfb9-4100-9c79-271fd50011de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:02:15 np0005466013 nova_compute[192144]: 2025-10-02 12:02:15.845 2 DEBUG oslo_concurrency.lockutils [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:02:16 np0005466013 nova_compute[192144]: 2025-10-02 12:02:16.019 2 DEBUG nova.network.neutron [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:02:17 np0005466013 nova_compute[192144]: 2025-10-02 12:02:17.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:17 np0005466013 nova_compute[192144]: 2025-10-02 12:02:17.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.568 2 DEBUG nova.network.neutron [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Updating instance_info_cache with network_info: [{"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.675 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Releasing lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.675 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Instance network_info: |[{"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.676 2 DEBUG oslo_concurrency.lockutils [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.676 2 DEBUG nova.network.neutron [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Refreshing network info cache for port 75561bb8-bfb9-4100-9c79-271fd50011de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.678 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Start _get_guest_xml network_info=[{"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:02:18 np0005466013 podman[220850]: 2025-10-02 12:02:18.681000089 +0000 UTC m=+0.051443052 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.687 2 WARNING nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:18 np0005466013 podman[220851]: 2025-10-02 12:02:18.688768959 +0000 UTC m=+0.058795230 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.693 2 DEBUG nova.virt.libvirt.host [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.694 2 DEBUG nova.virt.libvirt.host [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.704 2 DEBUG nova.virt.libvirt.host [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.705 2 DEBUG nova.virt.libvirt.host [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.706 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.707 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.707 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.707 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.708 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.708 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.708 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.708 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.709 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.709 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.709 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.709 2 DEBUG nova.virt.hardware [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:02:18 np0005466013 podman[220852]: 2025-10-02 12:02:18.713647238 +0000 UTC m=+0.080082098 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.714 2 DEBUG nova.virt.libvirt.vif [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1420306859',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1420306859',id=11,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cc73d75e0864e838eefa90cb33b7e01',ramdisk_id='',reservation_id='r-f4n74pvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-984573444',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-984573444-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:02:13Z,user_data=None,user_id='59e8135d73ee43e088ba5ee7d9bd84b1',uuid=f1267fe1-552c-4312-b9b0-c02eae82a77a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.715 2 DEBUG nova.network.os_vif_util [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converting VIF {"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.715 2 DEBUG nova.network.os_vif_util [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.716 2 DEBUG nova.objects.instance [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lazy-loading 'pci_devices' on Instance uuid f1267fe1-552c-4312-b9b0-c02eae82a77a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.740 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <uuid>f1267fe1-552c-4312-b9b0-c02eae82a77a</uuid>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <name>instance-0000000b</name>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1420306859</nova:name>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:02:18</nova:creationTime>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:user uuid="59e8135d73ee43e088ba5ee7d9bd84b1">tempest-LiveAutoBlockMigrationV225Test-984573444-project-member</nova:user>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:project uuid="5cc73d75e0864e838eefa90cb33b7e01">tempest-LiveAutoBlockMigrationV225Test-984573444</nova:project>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        <nova:port uuid="75561bb8-bfb9-4100-9c79-271fd50011de">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <entry name="serial">f1267fe1-552c-4312-b9b0-c02eae82a77a</entry>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <entry name="uuid">f1267fe1-552c-4312-b9b0-c02eae82a77a</entry>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk.config"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:19:d8:66"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <target dev="tap75561bb8-bf"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/console.log" append="off"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:02:18 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:02:18 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:02:18 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:02:18 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.741 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Preparing to wait for external event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.741 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.741 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.741 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.742 2 DEBUG nova.virt.libvirt.vif [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1420306859',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1420306859',id=11,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cc73d75e0864e838eefa90cb33b7e01',ramdisk_id='',reservation_id='r-f4n74pvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-984573444',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-98
4573444-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:02:13Z,user_data=None,user_id='59e8135d73ee43e088ba5ee7d9bd84b1',uuid=f1267fe1-552c-4312-b9b0-c02eae82a77a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.742 2 DEBUG nova.network.os_vif_util [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converting VIF {"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.743 2 DEBUG nova.network.os_vif_util [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.743 2 DEBUG os_vif [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.747 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75561bb8-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75561bb8-bf, col_values=(('external_ids', {'iface-id': '75561bb8-bfb9-4100-9c79-271fd50011de', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:d8:66', 'vm-uuid': 'f1267fe1-552c-4312-b9b0-c02eae82a77a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:18 np0005466013 NetworkManager[51205]: <info>  [1759406538.7603] manager: (tap75561bb8-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.767 2 INFO os_vif [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf')#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.846 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.846 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.846 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] No VIF found with MAC fa:16:3e:19:d8:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:02:18 np0005466013 nova_compute[192144]: 2025-10-02 12:02:18.846 2 INFO nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Using config drive#033[00m
Oct  2 08:02:19 np0005466013 nova_compute[192144]: 2025-10-02 12:02:19.925 2 INFO nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Creating config drive at /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk.config#033[00m
Oct  2 08:02:19 np0005466013 nova_compute[192144]: 2025-10-02 12:02:19.930 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp12qop8vg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.051 2 DEBUG oslo_concurrency.processutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp12qop8vg" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:20 np0005466013 kernel: tap75561bb8-bf: entered promiscuous mode
Oct  2 08:02:20 np0005466013 NetworkManager[51205]: <info>  [1759406540.0963] manager: (tap75561bb8-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00036|binding|INFO|Claiming lport 75561bb8-bfb9-4100-9c79-271fd50011de for this chassis.
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00037|binding|INFO|75561bb8-bfb9-4100-9c79-271fd50011de: Claiming fa:16:3e:19:d8:66 10.100.0.6
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00038|binding|INFO|Claiming lport 3197a9b3-066e-4dd5-acdc-899f59bb4e28 for this chassis.
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00039|binding|INFO|3197a9b3-066e-4dd5-acdc-899f59bb4e28: Claiming fa:16:3e:e8:2d:9c 19.80.0.100
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.144 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:2d:9c 19.80.0.100'], port_security=['fa:16:3e:e8:2d:9c 19.80.0.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['75561bb8-bfb9-4100-9c79-271fd50011de'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-207340523', 'neutron:cidrs': '19.80.0.100/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-207340523', 'neutron:project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3fadef5-4bfc-406c-93c4-14d4abd0583e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=1160f689-1347-48a0-ba14-b69afc977804, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3197a9b3-066e-4dd5-acdc-899f59bb4e28) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.146 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:d8:66 10.100.0.6'], port_security=['fa:16:3e:19:d8:66 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2028677656', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f1267fe1-552c-4312-b9b0-c02eae82a77a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-020b4768-a07a-4769-8636-455566c87083', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2028677656', 'neutron:project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3fadef5-4bfc-406c-93c4-14d4abd0583e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c0be75-bb4b-4e01-8cfa-b9aa4fcaf0e9, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=75561bb8-bfb9-4100-9c79-271fd50011de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.147 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 3197a9b3-066e-4dd5-acdc-899f59bb4e28 in datapath 78c3d2d3-8bfe-47b8-9282-3e9091b37043 bound to our chassis#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.149 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 78c3d2d3-8bfe-47b8-9282-3e9091b37043#033[00m
Oct  2 08:02:20 np0005466013 systemd-udevd[220937]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:02:20 np0005466013 systemd-machined[152202]: New machine qemu-5-instance-0000000b.
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.160 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[43901074-b648-4e46-9fb5-79349eba8690]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.161 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap78c3d2d3-81 in ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.163 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap78c3d2d3-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.163 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f006e2d8-d6a5-4724-871f-34684a04785a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.164 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9703ab2c-5f95-4703-962b-93cb26d3c5e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 NetworkManager[51205]: <info>  [1759406540.1694] device (tap75561bb8-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:02:20 np0005466013 NetworkManager[51205]: <info>  [1759406540.1704] device (tap75561bb8-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.176 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[e82f791f-561c-41a2-8c9d-3506ece7d0af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 systemd[1]: Started Virtual Machine qemu-5-instance-0000000b.
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.207 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[750bffeb-2049-444f-9742-fa775519ac4e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00040|binding|INFO|Setting lport 75561bb8-bfb9-4100-9c79-271fd50011de ovn-installed in OVS
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00041|binding|INFO|Setting lport 75561bb8-bfb9-4100-9c79-271fd50011de up in Southbound
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00042|binding|INFO|Setting lport 3197a9b3-066e-4dd5-acdc-899f59bb4e28 up in Southbound
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.233 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[656c2e05-506a-4ae8-abbd-c9191ebed40c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 NetworkManager[51205]: <info>  [1759406540.2405] manager: (tap78c3d2d3-80): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.240 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[55f93626-b60a-41c6-93fb-1a4438fb0fc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.266 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a63804e2-d5a7-42b4-a11b-6bdc18a876bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.268 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[03518683-860e-43e5-a7fe-ec426f583d3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 NetworkManager[51205]: <info>  [1759406540.2893] device (tap78c3d2d3-80): carrier: link connected
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.294 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[32785875-d68b-4417-aec4-4b324c5f31a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.311 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba4b1b2-4baf-4118-8a85-b4ce8db749cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap78c3d2d3-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:84:e1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453190, 'reachable_time': 27074, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220971, 'error': None, 'target': 'ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.325 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b5661efb-ced2-48a7-9c50-f97d27e14dc6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fece:84e1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453190, 'tstamp': 453190}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220972, 'error': None, 'target': 'ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.340 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b15ca9d8-c4bb-4171-8dbb-3c0ec6c15209]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap78c3d2d3-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:84:e1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453190, 'reachable_time': 27074, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220973, 'error': None, 'target': 'ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.373 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b8432b60-2b98-40b7-8740-548638b3d571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.420 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[36eb3077-1284-4b50-b1ad-97644892aa77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.422 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap78c3d2d3-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.422 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.423 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap78c3d2d3-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 NetworkManager[51205]: <info>  [1759406540.4255] manager: (tap78c3d2d3-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct  2 08:02:20 np0005466013 kernel: tap78c3d2d3-80: entered promiscuous mode
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.432 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap78c3d2d3-80, col_values=(('external_ids', {'iface-id': 'ced2993d-8371-4d25-a439-bcab0bc7265c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:20Z|00043|binding|INFO|Releasing lport ced2993d-8371-4d25-a439-bcab0bc7265c from this chassis (sb_readonly=0)
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.436 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/78c3d2d3-8bfe-47b8-9282-3e9091b37043.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/78c3d2d3-8bfe-47b8-9282-3e9091b37043.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.437 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[57ee73af-aa9d-490f-9237-fa8117d906fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.439 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-78c3d2d3-8bfe-47b8-9282-3e9091b37043
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/78c3d2d3-8bfe-47b8-9282-3e9091b37043.pid.haproxy
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 78c3d2d3-8bfe-47b8-9282-3e9091b37043
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.442 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'env', 'PROCESS_TAG=haproxy-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/78c3d2d3-8bfe-47b8-9282-3e9091b37043.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:02:20 np0005466013 nova_compute[192144]: 2025-10-02 12:02:20.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:20 np0005466013 podman[221005]: 2025-10-02 12:02:20.785008471 +0000 UTC m=+0.052324760 container create 090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:02:20 np0005466013 systemd[1]: Started libpod-conmon-090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d.scope.
Oct  2 08:02:20 np0005466013 podman[221005]: 2025-10-02 12:02:20.755338192 +0000 UTC m=+0.022654501 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:02:20 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:02:20 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d40e27b1e5f14a32a13a39cd948e444a729164853ae61fe5d3b81749eba05b1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:02:20 np0005466013 podman[221005]: 2025-10-02 12:02:20.872245428 +0000 UTC m=+0.139561737 container init 090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:02:20 np0005466013 podman[221005]: 2025-10-02 12:02:20.878449061 +0000 UTC m=+0.145765350 container start 090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:02:20 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [NOTICE]   (221024) : New worker (221026) forked
Oct  2 08:02:20 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [NOTICE]   (221024) : Loading success.
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.943 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 75561bb8-bfb9-4100-9c79-271fd50011de in datapath 020b4768-a07a-4769-8636-455566c87083 unbound from our chassis#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.945 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 020b4768-a07a-4769-8636-455566c87083#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.954 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ca1f2730-821c-43b9-9d5b-4ec9fce719fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.954 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap020b4768-a1 in ovnmeta-020b4768-a07a-4769-8636-455566c87083 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.956 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap020b4768-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.956 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0aba6679-353c-4e90-b5dc-ac481a14e7ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.957 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[836eae37-4118-4710-9e12-2797b27145ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.967 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[1ebd3288-ef57-49b4-8882-776c6fc2b80c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:20.994 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ef15da-74be-4b87-8432-81a71e0e7284]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.019 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e61c25ed-7b08-4577-8791-a16618fa6c9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 NetworkManager[51205]: <info>  [1759406541.0260] manager: (tap020b4768-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.024 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1df2aa00-b5e0-4237-8463-5aa9ed733cf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.050 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[1eea3df2-abe8-4b0b-8270-2ca7f9dd5271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.053 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[29714b52-327d-4c4a-a1b4-7cfd58e595ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 NetworkManager[51205]: <info>  [1759406541.0695] device (tap020b4768-a0): carrier: link connected
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.072 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a60a8026-6764-4f72-8b78-3b62dd717d9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.087 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4d4b29ba-f45d-49d7-8a24-dca5272481e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap020b4768-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:d2:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453268, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221052, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.100 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[77f9dbd4-b5f3-4523-b35a-51f7318ead28]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe62:d2ce'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453268, 'tstamp': 453268}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221053, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.115 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a48c7439-9188-48d0-a5bf-61a8aba856e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap020b4768-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:d2:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453268, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221054, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.141 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5653d6d3-a06e-4851-affc-0db77929eaee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.194 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e64be573-bf29-4f93-8be1-796e7edb167c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.196 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap020b4768-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.197 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.197 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap020b4768-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:21 np0005466013 NetworkManager[51205]: <info>  [1759406541.2457] manager: (tap020b4768-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct  2 08:02:21 np0005466013 kernel: tap020b4768-a0: entered promiscuous mode
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.248 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap020b4768-a0, col_values=(('external_ids', {'iface-id': '7ad14bc1-f6e9-4852-aef9-ac72c7291cba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:21Z|00044|binding|INFO|Releasing lport 7ad14bc1-f6e9-4852-aef9-ac72c7291cba from this chassis (sb_readonly=0)
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.263 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/020b4768-a07a-4769-8636-455566c87083.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/020b4768-a07a-4769-8636-455566c87083.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.263 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ec995a06-f857-47be-bcda-ea966d3f32a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.264 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-020b4768-a07a-4769-8636-455566c87083
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/020b4768-a07a-4769-8636-455566c87083.pid.haproxy
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 020b4768-a07a-4769-8636-455566c87083
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:02:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:21.265 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'env', 'PROCESS_TAG=haproxy-020b4768-a07a-4769-8636-455566c87083', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/020b4768-a07a-4769-8636-455566c87083.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.442 2 DEBUG nova.compute.manager [req-a0ecd3f4-ac60-4876-b58e-8e4d44f8b1a1 req-88cd2505-c77e-4be3-9132-51a1cbd17fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.443 2 DEBUG oslo_concurrency.lockutils [req-a0ecd3f4-ac60-4876-b58e-8e4d44f8b1a1 req-88cd2505-c77e-4be3-9132-51a1cbd17fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.443 2 DEBUG oslo_concurrency.lockutils [req-a0ecd3f4-ac60-4876-b58e-8e4d44f8b1a1 req-88cd2505-c77e-4be3-9132-51a1cbd17fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.443 2 DEBUG oslo_concurrency.lockutils [req-a0ecd3f4-ac60-4876-b58e-8e4d44f8b1a1 req-88cd2505-c77e-4be3-9132-51a1cbd17fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.444 2 DEBUG nova.compute.manager [req-a0ecd3f4-ac60-4876-b58e-8e4d44f8b1a1 req-88cd2505-c77e-4be3-9132-51a1cbd17fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Processing event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.476 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406541.4756687, f1267fe1-552c-4312-b9b0-c02eae82a77a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.476 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.478 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.482 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.487 2 INFO nova.virt.libvirt.driver [-] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Instance spawned successfully.#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.488 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.508 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.516 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.520 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.520 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.521 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.522 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.522 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.523 2 DEBUG nova.virt.libvirt.driver [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.561 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.562 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406541.4778023, f1267fe1-552c-4312-b9b0-c02eae82a77a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.563 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.606 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.608 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406541.4817076, f1267fe1-552c-4312-b9b0-c02eae82a77a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.609 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:02:21 np0005466013 podman[221086]: 2025-10-02 12:02:21.61456685 +0000 UTC m=+0.043135846 container create d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.629 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.633 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:02:21 np0005466013 systemd[1]: Started libpod-conmon-d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109.scope.
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.656 2 INFO nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Took 8.41 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.657 2 DEBUG nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.665 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:02:21 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:02:21 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86bd256e2755db322e16ebb017875b94020f2f82bf883a8daaded5878cca4c7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:02:21 np0005466013 podman[221086]: 2025-10-02 12:02:21.590071802 +0000 UTC m=+0.018640818 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:02:21 np0005466013 podman[221086]: 2025-10-02 12:02:21.692001965 +0000 UTC m=+0.120570991 container init d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:02:21 np0005466013 podman[221086]: 2025-10-02 12:02:21.698418623 +0000 UTC m=+0.126987619 container start d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:02:21 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [NOTICE]   (221105) : New worker (221107) forked
Oct  2 08:02:21 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [NOTICE]   (221105) : Loading success.
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.745 2 INFO nova.compute.manager [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Took 9.42 seconds to build instance.#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.762 2 DEBUG oslo_concurrency.lockutils [None req-542c37a3-b5af-4a5c-b56e-108688db987f 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.965 2 DEBUG nova.network.neutron [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Updated VIF entry in instance network info cache for port 75561bb8-bfb9-4100-9c79-271fd50011de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.967 2 DEBUG nova.network.neutron [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Updating instance_info_cache with network_info: [{"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:21 np0005466013 nova_compute[192144]: 2025-10-02 12:02:21.987 2 DEBUG oslo_concurrency.lockutils [req-82ba78d4-a31d-4ccd-89b8-b8eb8132f580 req-feb451b6-e2d3-4381-b186-c6848bb2a7ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:02:22 np0005466013 nova_compute[192144]: 2025-10-02 12:02:22.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:23 np0005466013 nova_compute[192144]: 2025-10-02 12:02:23.604 2 DEBUG nova.compute.manager [req-97d6e3ca-8345-4836-a85b-b35578ba7694 req-7cb7be6d-c91c-48e7-84ea-14fa85a0c717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:23 np0005466013 nova_compute[192144]: 2025-10-02 12:02:23.605 2 DEBUG oslo_concurrency.lockutils [req-97d6e3ca-8345-4836-a85b-b35578ba7694 req-7cb7be6d-c91c-48e7-84ea-14fa85a0c717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:23 np0005466013 nova_compute[192144]: 2025-10-02 12:02:23.606 2 DEBUG oslo_concurrency.lockutils [req-97d6e3ca-8345-4836-a85b-b35578ba7694 req-7cb7be6d-c91c-48e7-84ea-14fa85a0c717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:23 np0005466013 nova_compute[192144]: 2025-10-02 12:02:23.606 2 DEBUG oslo_concurrency.lockutils [req-97d6e3ca-8345-4836-a85b-b35578ba7694 req-7cb7be6d-c91c-48e7-84ea-14fa85a0c717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:23 np0005466013 nova_compute[192144]: 2025-10-02 12:02:23.607 2 DEBUG nova.compute.manager [req-97d6e3ca-8345-4836-a85b-b35578ba7694 req-7cb7be6d-c91c-48e7-84ea-14fa85a0c717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:23 np0005466013 nova_compute[192144]: 2025-10-02 12:02:23.607 2 WARNING nova.compute.manager [req-97d6e3ca-8345-4836-a85b-b35578ba7694 req-7cb7be6d-c91c-48e7-84ea-14fa85a0c717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received unexpected event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with vm_state active and task_state None.#033[00m
Oct  2 08:02:23 np0005466013 nova_compute[192144]: 2025-10-02 12:02:23.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:27 np0005466013 nova_compute[192144]: 2025-10-02 12:02:27.392 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Check if temp file /var/lib/nova/instances/tmp0t9xiq55 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  2 08:02:27 np0005466013 nova_compute[192144]: 2025-10-02 12:02:27.393 2 DEBUG nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0t9xiq55',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f1267fe1-552c-4312-b9b0-c02eae82a77a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  2 08:02:27 np0005466013 nova_compute[192144]: 2025-10-02 12:02:27.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:27 np0005466013 podman[221116]: 2025-10-02 12:02:27.677557741 +0000 UTC m=+0.055405655 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:02:28 np0005466013 nova_compute[192144]: 2025-10-02 12:02:28.598 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:28 np0005466013 nova_compute[192144]: 2025-10-02 12:02:28.659 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:28 np0005466013 nova_compute[192144]: 2025-10-02 12:02:28.661 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:28 np0005466013 nova_compute[192144]: 2025-10-02 12:02:28.715 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:28 np0005466013 nova_compute[192144]: 2025-10-02 12:02:28.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:30 np0005466013 podman[221143]: 2025-10-02 12:02:30.710778744 +0000 UTC m=+0.079387206 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:02:30 np0005466013 podman[221144]: 2025-10-02 12:02:30.711654241 +0000 UTC m=+0.077659753 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public)
Oct  2 08:02:31 np0005466013 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:02:31 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:02:31 np0005466013 systemd-logind[784]: New session 36 of user nova.
Oct  2 08:02:31 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:02:31 np0005466013 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:02:32 np0005466013 systemd[221185]: Queued start job for default target Main User Target.
Oct  2 08:02:32 np0005466013 systemd[221185]: Created slice User Application Slice.
Oct  2 08:02:32 np0005466013 systemd[221185]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:02:32 np0005466013 systemd[221185]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:02:32 np0005466013 systemd[221185]: Reached target Paths.
Oct  2 08:02:32 np0005466013 systemd[221185]: Reached target Timers.
Oct  2 08:02:32 np0005466013 systemd[221185]: Starting D-Bus User Message Bus Socket...
Oct  2 08:02:32 np0005466013 systemd[221185]: Starting Create User's Volatile Files and Directories...
Oct  2 08:02:32 np0005466013 systemd[221185]: Finished Create User's Volatile Files and Directories.
Oct  2 08:02:32 np0005466013 systemd[221185]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:02:32 np0005466013 systemd[221185]: Reached target Sockets.
Oct  2 08:02:32 np0005466013 systemd[221185]: Reached target Basic System.
Oct  2 08:02:32 np0005466013 systemd[221185]: Reached target Main User Target.
Oct  2 08:02:32 np0005466013 systemd[221185]: Startup finished in 146ms.
Oct  2 08:02:32 np0005466013 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:02:32 np0005466013 systemd[1]: Started Session 36 of User nova.
Oct  2 08:02:32 np0005466013 systemd[1]: session-36.scope: Deactivated successfully.
Oct  2 08:02:32 np0005466013 systemd-logind[784]: Session 36 logged out. Waiting for processes to exit.
Oct  2 08:02:32 np0005466013 systemd-logind[784]: Removed session 36.
Oct  2 08:02:32 np0005466013 nova_compute[192144]: 2025-10-02 12:02:32.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:33Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:d8:66 10.100.0.6
Oct  2 08:02:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:33Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:d8:66 10.100.0.6
Oct  2 08:02:33 np0005466013 nova_compute[192144]: 2025-10-02 12:02:33.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:33 np0005466013 nova_compute[192144]: 2025-10-02 12:02:33.808 2 DEBUG nova.compute.manager [req-b09eeab9-373b-4a5e-acb8-b58bd0baee9f req-299e8b39-2f9f-44fd-8e95-52da528ff1fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:33 np0005466013 nova_compute[192144]: 2025-10-02 12:02:33.809 2 DEBUG oslo_concurrency.lockutils [req-b09eeab9-373b-4a5e-acb8-b58bd0baee9f req-299e8b39-2f9f-44fd-8e95-52da528ff1fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:33 np0005466013 nova_compute[192144]: 2025-10-02 12:02:33.810 2 DEBUG oslo_concurrency.lockutils [req-b09eeab9-373b-4a5e-acb8-b58bd0baee9f req-299e8b39-2f9f-44fd-8e95-52da528ff1fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:33 np0005466013 nova_compute[192144]: 2025-10-02 12:02:33.810 2 DEBUG oslo_concurrency.lockutils [req-b09eeab9-373b-4a5e-acb8-b58bd0baee9f req-299e8b39-2f9f-44fd-8e95-52da528ff1fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:33 np0005466013 nova_compute[192144]: 2025-10-02 12:02:33.810 2 DEBUG nova.compute.manager [req-b09eeab9-373b-4a5e-acb8-b58bd0baee9f req-299e8b39-2f9f-44fd-8e95-52da528ff1fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:33 np0005466013 nova_compute[192144]: 2025-10-02 12:02:33.811 2 DEBUG nova.compute.manager [req-b09eeab9-373b-4a5e-acb8-b58bd0baee9f req-299e8b39-2f9f-44fd-8e95-52da528ff1fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.714 2 INFO nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Took 6.00 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.716 2 DEBUG nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.774 2 DEBUG nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0t9xiq55',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f1267fe1-552c-4312-b9b0-c02eae82a77a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0c9f3fac-4ed0-44d8-a3a1-41a61781ece4),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.832 2 DEBUG nova.objects.instance [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lazy-loading 'migration_context' on Instance uuid f1267fe1-552c-4312-b9b0-c02eae82a77a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.833 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.835 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.836 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.894 2 DEBUG nova.virt.libvirt.vif [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1420306859',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1420306859',id=11,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:02:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5cc73d75e0864e838eefa90cb33b7e01',ramdisk_id='',reservation_id='r-f4n74pvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-984573444',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-984573444-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:02:21Z,user_data=None,user_id='59e8135d73ee43e088ba5ee7d9bd84b1',uuid=f1267fe1-552c-4312-b9b0-c02eae82a77a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.895 2 DEBUG nova.network.os_vif_util [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Converting VIF {"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.895 2 DEBUG nova.network.os_vif_util [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.896 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Updating guest XML with vif config: <interface type="ethernet">
Oct  2 08:02:34 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:19:d8:66"/>
Oct  2 08:02:34 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:02:34 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:02:34 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:02:34 np0005466013 nova_compute[192144]:  <target dev="tap75561bb8-bf"/>
Oct  2 08:02:34 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:02:34 np0005466013 nova_compute[192144]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  2 08:02:34 np0005466013 nova_compute[192144]: 2025-10-02 12:02:34.897 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.339 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.340 2 INFO nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.528 2 INFO nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.979 2 DEBUG nova.compute.manager [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.980 2 DEBUG oslo_concurrency.lockutils [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.981 2 DEBUG oslo_concurrency.lockutils [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.981 2 DEBUG oslo_concurrency.lockutils [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.981 2 DEBUG nova.compute.manager [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.982 2 WARNING nova.compute.manager [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received unexpected event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.982 2 DEBUG nova.compute.manager [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-changed-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.983 2 DEBUG nova.compute.manager [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Refreshing instance network info cache due to event network-changed-75561bb8-bfb9-4100-9c79-271fd50011de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.983 2 DEBUG oslo_concurrency.lockutils [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.984 2 DEBUG oslo_concurrency.lockutils [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:02:35 np0005466013 nova_compute[192144]: 2025-10-02 12:02:35.984 2 DEBUG nova.network.neutron [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Refreshing network info cache for port 75561bb8-bfb9-4100-9c79-271fd50011de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:02:36 np0005466013 nova_compute[192144]: 2025-10-02 12:02:36.031 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:02:36 np0005466013 nova_compute[192144]: 2025-10-02 12:02:36.032 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:02:36 np0005466013 nova_compute[192144]: 2025-10-02 12:02:36.536 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:02:36 np0005466013 nova_compute[192144]: 2025-10-02 12:02:36.536 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.040 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.041 2 DEBUG nova.virt.libvirt.migration [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.169 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406557.1692677, f1267fe1-552c-4312-b9b0-c02eae82a77a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.170 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.200 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.204 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.225 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  2 08:02:37 np0005466013 kernel: tap75561bb8-bf (unregistering): left promiscuous mode
Oct  2 08:02:37 np0005466013 NetworkManager[51205]: <info>  [1759406557.3048] device (tap75561bb8-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:02:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:37Z|00045|binding|INFO|Releasing lport 75561bb8-bfb9-4100-9c79-271fd50011de from this chassis (sb_readonly=0)
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:37Z|00046|binding|INFO|Setting lport 75561bb8-bfb9-4100-9c79-271fd50011de down in Southbound
Oct  2 08:02:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:37Z|00047|binding|INFO|Releasing lport 3197a9b3-066e-4dd5-acdc-899f59bb4e28 from this chassis (sb_readonly=0)
Oct  2 08:02:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:37Z|00048|binding|INFO|Setting lport 3197a9b3-066e-4dd5-acdc-899f59bb4e28 down in Southbound
Oct  2 08:02:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:37Z|00049|binding|INFO|Removing iface tap75561bb8-bf ovn-installed in OVS
Oct  2 08:02:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:37Z|00050|binding|INFO|Releasing lport 7ad14bc1-f6e9-4852-aef9-ac72c7291cba from this chassis (sb_readonly=0)
Oct  2 08:02:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:02:37Z|00051|binding|INFO|Releasing lport ced2993d-8371-4d25-a439-bcab0bc7265c from this chassis (sb_readonly=0)
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.324 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:2d:9c 19.80.0.100'], port_security=['fa:16:3e:e8:2d:9c 19.80.0.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['75561bb8-bfb9-4100-9c79-271fd50011de'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-207340523', 'neutron:cidrs': '19.80.0.100/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-207340523', 'neutron:project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'f3fadef5-4bfc-406c-93c4-14d4abd0583e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=1160f689-1347-48a0-ba14-b69afc977804, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3197a9b3-066e-4dd5-acdc-899f59bb4e28) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.326 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:d8:66 10.100.0.6'], port_security=['fa:16:3e:19:d8:66 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c9f3d658-5c7a-4803-9bbb-01adfb7e88ca'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2028677656', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f1267fe1-552c-4312-b9b0-c02eae82a77a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-020b4768-a07a-4769-8636-455566c87083', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2028677656', 'neutron:project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f3fadef5-4bfc-406c-93c4-14d4abd0583e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c0be75-bb4b-4e01-8cfa-b9aa4fcaf0e9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=75561bb8-bfb9-4100-9c79-271fd50011de) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.327 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 3197a9b3-066e-4dd5-acdc-899f59bb4e28 in datapath 78c3d2d3-8bfe-47b8-9282-3e9091b37043 unbound from our chassis#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.329 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 78c3d2d3-8bfe-47b8-9282-3e9091b37043, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.331 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[839bd40a-4c17-4918-b1e8-8eacc11c8e05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.333 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043 namespace which is not needed anymore#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 podman[221240]: 2025-10-02 12:02:37.40921273 +0000 UTC m=+0.070306176 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:02:37 np0005466013 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct  2 08:02:37 np0005466013 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000b.scope: Consumed 13.217s CPU time.
Oct  2 08:02:37 np0005466013 systemd-machined[152202]: Machine qemu-5-instance-0000000b terminated.
Oct  2 08:02:37 np0005466013 podman[221242]: 2025-10-02 12:02:37.438537937 +0000 UTC m=+0.099727096 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=iscsid)
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [NOTICE]   (221024) : haproxy version is 2.8.14-c23fe91
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [NOTICE]   (221024) : path to executable is /usr/sbin/haproxy
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [WARNING]  (221024) : Exiting Master process...
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [WARNING]  (221024) : Exiting Master process...
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [ALERT]    (221024) : Current worker (221026) exited with code 143 (Terminated)
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043[221020]: [WARNING]  (221024) : All workers exited. Exiting... (0)
Oct  2 08:02:37 np0005466013 systemd[1]: libpod-090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d.scope: Deactivated successfully.
Oct  2 08:02:37 np0005466013 podman[221306]: 2025-10-02 12:02:37.468787602 +0000 UTC m=+0.041186404 container died 090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:02:37 np0005466013 systemd[1]: var-lib-containers-storage-overlay-4d40e27b1e5f14a32a13a39cd948e444a729164853ae61fe5d3b81749eba05b1-merged.mount: Deactivated successfully.
Oct  2 08:02:37 np0005466013 podman[221306]: 2025-10-02 12:02:37.507716747 +0000 UTC m=+0.080115549 container cleanup 090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:02:37 np0005466013 systemd[1]: libpod-conmon-090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d.scope: Deactivated successfully.
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.531 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.532 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.532 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.542 2 DEBUG nova.virt.libvirt.guest [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'f1267fe1-552c-4312-b9b0-c02eae82a77a' (instance-0000000b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.543 2 INFO nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Migration operation has completed#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.544 2 INFO nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] _post_live_migration() is started..#033[00m
Oct  2 08:02:37 np0005466013 podman[221347]: 2025-10-02 12:02:37.613670404 +0000 UTC m=+0.084371921 container remove 090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.619 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[45d4e549-2a8c-42cc-bbd0-68b41b58914c]: (4, ('Thu Oct  2 12:02:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043 (090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d)\n090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d\nThu Oct  2 12:02:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043 (090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d)\n090c1e471d20d332fde317da13dd51acd350086f8be3203a888f2aeac4f5251d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.620 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0205e3-2333-4465-b385-3d31a225b4ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.621 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap78c3d2d3-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:37 np0005466013 kernel: tap78c3d2d3-80: left promiscuous mode
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.641 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[789988db-d509-436c-a920-34cdaeda830c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.675 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3084db-8fd7-46b1-9999-c724cd7dc971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.676 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aa26ef54-af16-4d37-ae4b-5dcb3398aead]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.690 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ede38973-bc34-4d38-a9c9-42bb00ba1529]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453184, 'reachable_time': 39839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221365, 'error': None, 'target': 'ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.693 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-78c3d2d3-8bfe-47b8-9282-3e9091b37043 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.693 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[12ed8545-6369-4301-9c61-9c29d832c776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.695 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 75561bb8-bfb9-4100-9c79-271fd50011de in datapath 020b4768-a07a-4769-8636-455566c87083 unbound from our chassis#033[00m
Oct  2 08:02:37 np0005466013 systemd[1]: run-netns-ovnmeta\x2d78c3d2d3\x2d8bfe\x2d47b8\x2d9282\x2d3e9091b37043.mount: Deactivated successfully.
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.696 2 DEBUG nova.compute.manager [req-548054de-1738-4bf5-8878-d8b302264e93 req-908b21fa-2c43-4a57-8de5-985d08793eaa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.696 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 020b4768-a07a-4769-8636-455566c87083, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.696 2 DEBUG oslo_concurrency.lockutils [req-548054de-1738-4bf5-8878-d8b302264e93 req-908b21fa-2c43-4a57-8de5-985d08793eaa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.696 2 DEBUG oslo_concurrency.lockutils [req-548054de-1738-4bf5-8878-d8b302264e93 req-908b21fa-2c43-4a57-8de5-985d08793eaa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.696 2 DEBUG oslo_concurrency.lockutils [req-548054de-1738-4bf5-8878-d8b302264e93 req-908b21fa-2c43-4a57-8de5-985d08793eaa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.697 2 DEBUG nova.compute.manager [req-548054de-1738-4bf5-8878-d8b302264e93 req-908b21fa-2c43-4a57-8de5-985d08793eaa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.697 2 DEBUG nova.compute.manager [req-548054de-1738-4bf5-8878-d8b302264e93 req-908b21fa-2c43-4a57-8de5-985d08793eaa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.697 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[eab65230-416a-4ee2-8750-b39a91e9d46f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.698 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-020b4768-a07a-4769-8636-455566c87083 namespace which is not needed anymore#033[00m
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [NOTICE]   (221105) : haproxy version is 2.8.14-c23fe91
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [NOTICE]   (221105) : path to executable is /usr/sbin/haproxy
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [WARNING]  (221105) : Exiting Master process...
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [WARNING]  (221105) : Exiting Master process...
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [ALERT]    (221105) : Current worker (221107) exited with code 143 (Terminated)
Oct  2 08:02:37 np0005466013 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[221101]: [WARNING]  (221105) : All workers exited. Exiting... (0)
Oct  2 08:02:37 np0005466013 systemd[1]: libpod-d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109.scope: Deactivated successfully.
Oct  2 08:02:37 np0005466013 conmon[221101]: conmon d0e0a0e6544e44408e15 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109.scope/container/memory.events
Oct  2 08:02:37 np0005466013 podman[221383]: 2025-10-02 12:02:37.820548433 +0000 UTC m=+0.044477567 container died d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:02:37 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109-userdata-shm.mount: Deactivated successfully.
Oct  2 08:02:37 np0005466013 systemd[1]: var-lib-containers-storage-overlay-86bd256e2755db322e16ebb017875b94020f2f82bf883a8daaded5878cca4c7d-merged.mount: Deactivated successfully.
Oct  2 08:02:37 np0005466013 podman[221383]: 2025-10-02 12:02:37.865431821 +0000 UTC m=+0.089360935 container cleanup d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:02:37 np0005466013 systemd[1]: libpod-conmon-d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109.scope: Deactivated successfully.
Oct  2 08:02:37 np0005466013 podman[221412]: 2025-10-02 12:02:37.937957415 +0000 UTC m=+0.049054568 container remove d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.943 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3feee17f-b681-4664-aa99-37ae9536d692]: (4, ('Thu Oct  2 12:02:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083 (d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109)\nd0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109\nThu Oct  2 12:02:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083 (d0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109)\nd0e0a0e6544e44408e15189c9c2f2b95529c7b7cb4800ac5483a0d8f640c8109\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.945 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bb45a527-12a1-4869-965d-c254867597f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.945 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap020b4768-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 kernel: tap020b4768-a0: left promiscuous mode
Oct  2 08:02:37 np0005466013 nova_compute[192144]: 2025-10-02 12:02:37.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.964 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d00e8237-b959-463d-ad79-073bc522e731]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:37.999 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6bb625-a630-47a8-be43-4ebbd5935555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:38.000 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[993ad435-d699-4fe6-9a70-b64794946e36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:38.014 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ca3ba5-bc7e-4e47-9af0-cefa2ea96ac0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453263, 'reachable_time': 25347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221430, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:38.016 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-020b4768-a07a-4769-8636-455566c87083 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:02:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:02:38.016 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[90f9fbcb-ec3e-4d24-b5c7-829c1fc0fd37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:38 np0005466013 nova_compute[192144]: 2025-10-02 12:02:38.131 2 DEBUG nova.network.neutron [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Updated VIF entry in instance network info cache for port 75561bb8-bfb9-4100-9c79-271fd50011de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:02:38 np0005466013 nova_compute[192144]: 2025-10-02 12:02:38.131 2 DEBUG nova.network.neutron [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Updating instance_info_cache with network_info: [{"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:02:38 np0005466013 nova_compute[192144]: 2025-10-02 12:02:38.149 2 DEBUG oslo_concurrency.lockutils [req-e414bb44-5b77-45d4-9cc5-5418c059123a req-22b1282f-72e4-42f6-836f-1f2ddbeeafae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f1267fe1-552c-4312-b9b0-c02eae82a77a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:02:38 np0005466013 systemd[1]: run-netns-ovnmeta\x2d020b4768\x2da07a\x2d4769\x2d8636\x2d455566c87083.mount: Deactivated successfully.
Oct  2 08:02:38 np0005466013 nova_compute[192144]: 2025-10-02 12:02:38.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.848 2 DEBUG nova.compute.manager [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.849 2 DEBUG oslo_concurrency.lockutils [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.849 2 DEBUG oslo_concurrency.lockutils [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.849 2 DEBUG oslo_concurrency.lockutils [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.849 2 DEBUG nova.compute.manager [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.849 2 WARNING nova.compute.manager [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received unexpected event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.850 2 DEBUG nova.compute.manager [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.850 2 DEBUG oslo_concurrency.lockutils [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.850 2 DEBUG oslo_concurrency.lockutils [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.850 2 DEBUG oslo_concurrency.lockutils [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.850 2 DEBUG nova.compute.manager [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.850 2 WARNING nova.compute.manager [req-45b2ecb7-a2c9-4c87-83af-1f22fc26648d req-c69f26a2-abf1-416f-b013-16d4485b5753 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received unexpected event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.946 2 DEBUG nova.compute.manager [req-739f4a7c-7c12-448f-83e4-88d4ce14b711 req-3873a5b2-861e-4f74-8773-86ff7bf9796e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.946 2 DEBUG oslo_concurrency.lockutils [req-739f4a7c-7c12-448f-83e4-88d4ce14b711 req-3873a5b2-861e-4f74-8773-86ff7bf9796e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.947 2 DEBUG oslo_concurrency.lockutils [req-739f4a7c-7c12-448f-83e4-88d4ce14b711 req-3873a5b2-861e-4f74-8773-86ff7bf9796e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.947 2 DEBUG oslo_concurrency.lockutils [req-739f4a7c-7c12-448f-83e4-88d4ce14b711 req-3873a5b2-861e-4f74-8773-86ff7bf9796e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.947 2 DEBUG nova.compute.manager [req-739f4a7c-7c12-448f-83e4-88d4ce14b711 req-3873a5b2-861e-4f74-8773-86ff7bf9796e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:39 np0005466013 nova_compute[192144]: 2025-10-02 12:02:39.947 2 DEBUG nova.compute.manager [req-739f4a7c-7c12-448f-83e4-88d4ce14b711 req-3873a5b2-861e-4f74-8773-86ff7bf9796e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-unplugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.317 2 DEBUG nova.network.neutron [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Activated binding for port 75561bb8-bfb9-4100-9c79-271fd50011de and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.318 2 DEBUG nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.318 2 DEBUG nova.virt.libvirt.vif [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1420306859',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1420306859',id=11,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:02:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5cc73d75e0864e838eefa90cb33b7e01',ramdisk_id='',reservation_id='r-f4n74pvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-984573444',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-984573444-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:02:26Z,user_data=None,user_id='59e8135d73ee43e088ba5ee7d9bd84b1',uuid=f1267fe1-552c-4312-b9b0-c02eae82a77a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.319 2 DEBUG nova.network.os_vif_util [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Converting VIF {"id": "75561bb8-bfb9-4100-9c79-271fd50011de", "address": "fa:16:3e:19:d8:66", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75561bb8-bf", "ovs_interfaceid": "75561bb8-bfb9-4100-9c79-271fd50011de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.319 2 DEBUG nova.network.os_vif_util [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.319 2 DEBUG os_vif [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.322 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75561bb8-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.326 2 INFO os_vif [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:d8:66,bridge_name='br-int',has_traffic_filtering=True,id=75561bb8-bfb9-4100-9c79-271fd50011de,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap75561bb8-bf')#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.327 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.327 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.327 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.328 2 DEBUG nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.328 2 INFO nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Deleting instance files /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a_del#033[00m
Oct  2 08:02:40 np0005466013 nova_compute[192144]: 2025-10-02 12:02:40.329 2 INFO nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Deletion of /var/lib/nova/instances/f1267fe1-552c-4312-b9b0-c02eae82a77a_del complete#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.973 2 DEBUG nova.compute.manager [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.973 2 DEBUG oslo_concurrency.lockutils [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.974 2 DEBUG oslo_concurrency.lockutils [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.975 2 DEBUG oslo_concurrency.lockutils [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.975 2 DEBUG nova.compute.manager [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.975 2 WARNING nova.compute.manager [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received unexpected event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.976 2 DEBUG nova.compute.manager [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.976 2 DEBUG oslo_concurrency.lockutils [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.976 2 DEBUG oslo_concurrency.lockutils [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.977 2 DEBUG oslo_concurrency.lockutils [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.977 2 DEBUG nova.compute.manager [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] No waiting events found dispatching network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:41 np0005466013 nova_compute[192144]: 2025-10-02 12:02:41.977 2 WARNING nova.compute.manager [req-4a9e9702-54a0-46ea-a1f1-0dd31275c1a8 req-8172ec75-873e-4737-8c8c-388d5fae5980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Received unexpected event network-vif-plugged-75561bb8-bfb9-4100-9c79-271fd50011de for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:02:42 np0005466013 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:02:42 np0005466013 systemd[221185]: Activating special unit Exit the Session...
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped target Main User Target.
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped target Basic System.
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped target Paths.
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped target Sockets.
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped target Timers.
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:02:42 np0005466013 systemd[221185]: Closed D-Bus User Message Bus Socket.
Oct  2 08:02:42 np0005466013 systemd[221185]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:02:42 np0005466013 systemd[221185]: Removed slice User Application Slice.
Oct  2 08:02:42 np0005466013 systemd[221185]: Reached target Shutdown.
Oct  2 08:02:42 np0005466013 systemd[221185]: Finished Exit the Session.
Oct  2 08:02:42 np0005466013 systemd[221185]: Reached target Exit the Session.
Oct  2 08:02:42 np0005466013 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:02:42 np0005466013 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:02:42 np0005466013 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:02:42 np0005466013 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:02:42 np0005466013 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:02:42 np0005466013 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:02:42 np0005466013 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:02:42 np0005466013 nova_compute[192144]: 2025-10-02 12:02:42.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:45 np0005466013 nova_compute[192144]: 2025-10-02 12:02:45.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:47 np0005466013 nova_compute[192144]: 2025-10-02 12:02:47.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.672 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Acquiring lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.673 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.673 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "f1267fe1-552c-4312-b9b0-c02eae82a77a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.735 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.736 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.736 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.736 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:02:48 np0005466013 podman[221439]: 2025-10-02 12:02:48.849731326 +0000 UTC m=+0.070176361 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:02:48 np0005466013 podman[221438]: 2025-10-02 12:02:48.853757381 +0000 UTC m=+0.077024594 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:02:48 np0005466013 podman[221440]: 2025-10-02 12:02:48.860707306 +0000 UTC m=+0.077659383 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.913 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.964 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:48 np0005466013 nova_compute[192144]: 2025-10-02 12:02:48.965 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.018 2 DEBUG oslo_concurrency.processutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.134 2 WARNING nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.135 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5619MB free_disk=73.43965911865234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.135 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.135 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.214 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Migration for instance f1267fe1-552c-4312-b9b0-c02eae82a77a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.250 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.275 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.276 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Migration 0c9f3fac-4ed0-44d8-a3a1-41a61781ece4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.276 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.276 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.360 2 DEBUG nova.compute.provider_tree [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.387 2 DEBUG nova.scheduler.client.report [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.424 2 DEBUG nova.compute.resource_tracker [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.424 2 DEBUG oslo_concurrency.lockutils [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.441 2 INFO nova.compute.manager [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.573 2 INFO nova.scheduler.client.report [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] Deleted allocation for migration 0c9f3fac-4ed0-44d8-a3a1-41a61781ece4#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.573 2 DEBUG nova.virt.libvirt.driver [None req-21ca6e31-d6e2-4253-bdcb-0564f80409a6 4a8407cab3084bfc9d72832f5e66d8c5 5ee775f4f54b4dfda5adf759c97ba3ec - - default default] [instance: f1267fe1-552c-4312-b9b0-c02eae82a77a] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.973 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Acquiring lock "380d4ae7-3796-4b0c-88df-df667484bad3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.973 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "380d4ae7-3796-4b0c-88df-df667484bad3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:49 np0005466013 nova_compute[192144]: 2025-10-02 12:02:49.997 2 DEBUG nova.compute.manager [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.102 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.102 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.108 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.109 2 INFO nova.compute.claims [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.260 2 DEBUG nova.compute.provider_tree [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.282 2 DEBUG nova.scheduler.client.report [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.347 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.348 2 DEBUG nova.compute.manager [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.441 2 DEBUG nova.compute.manager [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.441 2 DEBUG nova.network.neutron [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.468 2 INFO nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.528 2 DEBUG nova.compute.manager [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.683 2 DEBUG nova.compute.manager [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.684 2 DEBUG nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.685 2 INFO nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Creating image(s)#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.685 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Acquiring lock "/var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.686 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "/var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.686 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "/var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.697 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.784 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.785 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.786 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.802 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.820 2 DEBUG nova.network.neutron [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.821 2 DEBUG nova.compute.manager [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.856 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.857 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.899 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.900 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.901 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.958 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.960 2 DEBUG nova.virt.disk.api [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Checking if we can resize image /var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:02:50 np0005466013 nova_compute[192144]: 2025-10-02 12:02:50.960 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.018 2 DEBUG oslo_concurrency.processutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.018 2 DEBUG nova.virt.disk.api [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Cannot resize image /var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.019 2 DEBUG nova.objects.instance [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lazy-loading 'migration_context' on Instance uuid 380d4ae7-3796-4b0c-88df-df667484bad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.144 2 DEBUG nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.144 2 DEBUG nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Ensure instance console log exists: /var/lib/nova/instances/380d4ae7-3796-4b0c-88df-df667484bad3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.144 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.145 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.145 2 DEBUG oslo_concurrency.lockutils [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.147 2 DEBUG nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.151 2 WARNING nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.164 2 DEBUG nova.virt.libvirt.host [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.165 2 DEBUG nova.virt.libvirt.host [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.184 2 DEBUG nova.virt.libvirt.host [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.186 2 DEBUG nova.virt.libvirt.host [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.187 2 DEBUG nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.187 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.187 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.188 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.188 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.188 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.188 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.188 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.189 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.189 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.189 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.189 2 DEBUG nova.virt.hardware [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.193 2 DEBUG nova.objects.instance [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] Lazy-loading 'pci_devices' on Instance uuid 380d4ae7-3796-4b0c-88df-df667484bad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:51 np0005466013 nova_compute[192144]: 2025-10-02 12:02:51.213 2 DEBUG nova.virt.libvirt.driver [None req-54446560-9457-42b8-b77f-4ed33efc40fb d413386800eb45c8959596be3a47c369 8b0d43e818674dfd81b38d17af224b0d - - default default] [instance: 380d4ae7-3796-4b0c-88df-df667484bad3] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <uuid>380d4ae7-3796-4b0c-88df-df667484bad3</uuid>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <name>instance-0000000f</name>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-124669410</nova:name>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:02:51</nova:creationTime>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:02:51 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:        <nova:user uuid="d413386800eb45c8959596be3a47c369">tempest-ServersAdminNegativeTestJSON-528126379-project-member</nova:user>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:        <nova:project uuid="8b0d43e818674dfd81b38d17af224b0d">tempest-ServersAdminNegativeTestJSON-528126379</nova:project>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:02:51 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <entry name="serial">380d4ae7-3796-4b0c-88df-df667484bad3</entry>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <entry name="uuid">380d4ae7-3796-4b0c-88df-df667484bad3</entry>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:02:51 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:03:41 np0005466013 nova_compute[192144]: 2025-10-02 12:03:41.283 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Check if temp file /var/lib/nova/instances/tmpeyk8xlf5 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  2 08:03:41 np0005466013 nova_compute[192144]: 2025-10-02 12:03:41.287 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:41 np0005466013 nova_compute[192144]: 2025-10-02 12:03:41.343 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:41 np0005466013 nova_compute[192144]: 2025-10-02 12:03:41.344 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:41 np0005466013 nova_compute[192144]: 2025-10-02 12:03:41.403 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:41 np0005466013 nova_compute[192144]: 2025-10-02 12:03:41.404 2 DEBUG nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeyk8xlf5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='356bc6d6-1101-467e-a020-65876724c955',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  2 08:03:41 np0005466013 rsyslogd[1003]: imjournal: 1501 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.273 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.328 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.329 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.391 2 DEBUG oslo_concurrency.processutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.773 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "c4dcd0cb-13dd-4990-9232-8a51a6c5eff4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.773 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "c4dcd0cb-13dd-4990-9232-8a51a6c5eff4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.774 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "c4dcd0cb-13dd-4990-9232-8a51a6c5eff4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.774 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "c4dcd0cb-13dd-4990-9232-8a51a6c5eff4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.774 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "c4dcd0cb-13dd-4990-9232-8a51a6c5eff4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.802 2 INFO nova.compute.manager [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Terminating instance#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.822 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "refresh_cache-c4dcd0cb-13dd-4990-9232-8a51a6c5eff4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.823 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquired lock "refresh_cache-c4dcd0cb-13dd-4990-9232-8a51a6c5eff4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:03:42 np0005466013 nova_compute[192144]: 2025-10-02 12:03:42.823 2 DEBUG nova.network.neutron [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.106 2 DEBUG nova.network.neutron [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.370 2 DEBUG nova.network.neutron [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.397 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Releasing lock "refresh_cache-c4dcd0cb-13dd-4990-9232-8a51a6c5eff4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.398 2 DEBUG nova.compute.manager [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:03:43 np0005466013 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000015.scope: Deactivated successfully.
Oct  2 08:03:43 np0005466013 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000015.scope: Consumed 5.239s CPU time.
Oct  2 08:03:43 np0005466013 systemd-machined[152202]: Machine qemu-8-instance-00000015 terminated.
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.658 2 INFO nova.virt.libvirt.driver [-] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Instance destroyed successfully.#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.659 2 DEBUG nova.objects.instance [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lazy-loading 'resources' on Instance uuid c4dcd0cb-13dd-4990-9232-8a51a6c5eff4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.674 2 INFO nova.virt.libvirt.driver [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Deleting instance files /var/lib/nova/instances/c4dcd0cb-13dd-4990-9232-8a51a6c5eff4_del#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.675 2 INFO nova.virt.libvirt.driver [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Deletion of /var/lib/nova/instances/c4dcd0cb-13dd-4990-9232-8a51a6c5eff4_del complete#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.774 2 INFO nova.compute.manager [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.774 2 DEBUG oslo.service.loopingcall [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.775 2 DEBUG nova.compute.manager [-] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:03:43 np0005466013 nova_compute[192144]: 2025-10-02 12:03:43.775 2 DEBUG nova.network.neutron [-] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.020 2 DEBUG nova.network.neutron [-] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.033 2 DEBUG nova.network.neutron [-] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.047 2 INFO nova.compute.manager [-] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Took 0.27 seconds to deallocate network for instance.#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.143 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.143 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.217 2 DEBUG nova.compute.provider_tree [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.232 2 DEBUG nova.scheduler.client.report [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.283 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.316 2 INFO nova.scheduler.client.report [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Deleted allocations for instance c4dcd0cb-13dd-4990-9232-8a51a6c5eff4#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.456 2 DEBUG oslo_concurrency.lockutils [None req-12953269-c589-4aca-b822-0f4c84544c3c d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "c4dcd0cb-13dd-4990-9232-8a51a6c5eff4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:44 np0005466013 nova_compute[192144]: 2025-10-02 12:03:44.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:45 np0005466013 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:03:45 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:03:45 np0005466013 systemd-logind[784]: New session 38 of user nova.
Oct  2 08:03:45 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:03:45 np0005466013 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:03:45 np0005466013 systemd[222082]: Queued start job for default target Main User Target.
Oct  2 08:03:45 np0005466013 systemd[222082]: Created slice User Application Slice.
Oct  2 08:03:45 np0005466013 systemd[222082]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:03:45 np0005466013 systemd[222082]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:03:45 np0005466013 systemd[222082]: Reached target Paths.
Oct  2 08:03:45 np0005466013 systemd[222082]: Reached target Timers.
Oct  2 08:03:45 np0005466013 systemd[222082]: Starting D-Bus User Message Bus Socket...
Oct  2 08:03:45 np0005466013 systemd[222082]: Starting Create User's Volatile Files and Directories...
Oct  2 08:03:45 np0005466013 systemd[222082]: Finished Create User's Volatile Files and Directories.
Oct  2 08:03:45 np0005466013 systemd[222082]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:03:45 np0005466013 systemd[222082]: Reached target Sockets.
Oct  2 08:03:45 np0005466013 systemd[222082]: Reached target Basic System.
Oct  2 08:03:45 np0005466013 systemd[222082]: Reached target Main User Target.
Oct  2 08:03:45 np0005466013 systemd[222082]: Startup finished in 136ms.
Oct  2 08:03:45 np0005466013 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:03:45 np0005466013 systemd[1]: Started Session 38 of User nova.
Oct  2 08:03:45 np0005466013 systemd-logind[784]: Session 38 logged out. Waiting for processes to exit.
Oct  2 08:03:45 np0005466013 systemd[1]: session-38.scope: Deactivated successfully.
Oct  2 08:03:45 np0005466013 systemd-logind[784]: Removed session 38.
Oct  2 08:03:46 np0005466013 nova_compute[192144]: 2025-10-02 12:03:46.580 2 DEBUG nova.compute.manager [req-d0784023-e250-42bf-9d99-fc1402927118 req-85a62212-948b-4cb4-a007-4a7d22de03ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:46 np0005466013 nova_compute[192144]: 2025-10-02 12:03:46.582 2 DEBUG oslo_concurrency.lockutils [req-d0784023-e250-42bf-9d99-fc1402927118 req-85a62212-948b-4cb4-a007-4a7d22de03ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:46 np0005466013 nova_compute[192144]: 2025-10-02 12:03:46.582 2 DEBUG oslo_concurrency.lockutils [req-d0784023-e250-42bf-9d99-fc1402927118 req-85a62212-948b-4cb4-a007-4a7d22de03ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:46 np0005466013 nova_compute[192144]: 2025-10-02 12:03:46.582 2 DEBUG oslo_concurrency.lockutils [req-d0784023-e250-42bf-9d99-fc1402927118 req-85a62212-948b-4cb4-a007-4a7d22de03ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:46 np0005466013 nova_compute[192144]: 2025-10-02 12:03:46.582 2 DEBUG nova.compute.manager [req-d0784023-e250-42bf-9d99-fc1402927118 req-85a62212-948b-4cb4-a007-4a7d22de03ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:46 np0005466013 nova_compute[192144]: 2025-10-02 12:03:46.582 2 DEBUG nova.compute.manager [req-d0784023-e250-42bf-9d99-fc1402927118 req-85a62212-948b-4cb4-a007-4a7d22de03ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.735 2 INFO nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Took 5.34 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.737 2 DEBUG nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.779 2 DEBUG nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeyk8xlf5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='356bc6d6-1101-467e-a020-65876724c955',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d5491466-7696-4e8f-b88a-dec3442315d4),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.803 2 DEBUG nova.objects.instance [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lazy-loading 'migration_context' on Instance uuid 356bc6d6-1101-467e-a020-65876724c955 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.804 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.806 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.806 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.821 2 DEBUG nova.virt.libvirt.vif [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:03:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-507794369',display_name='tempest-LiveMigrationTest-server-507794369',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-507794369',id=20,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:03:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-hsf0qpxd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:03:36Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=356bc6d6-1101-467e-a020-65876724c955,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.822 2 DEBUG nova.network.os_vif_util [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converting VIF {"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.822 2 DEBUG nova.network.os_vif_util [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.823 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Updating guest XML with vif config: <interface type="ethernet">
Oct  2 08:03:47 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:1d:3d:20"/>
Oct  2 08:03:47 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:03:47 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:03:47 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:03:47 np0005466013 nova_compute[192144]:  <target dev="tap29214def-24"/>
Oct  2 08:03:47 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:03:47 np0005466013 nova_compute[192144]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  2 08:03:47 np0005466013 nova_compute[192144]: 2025-10-02 12:03:47.823 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.308 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.309 2 INFO nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.486 2 INFO nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.817 2 DEBUG nova.compute.manager [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.818 2 DEBUG oslo_concurrency.lockutils [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.818 2 DEBUG oslo_concurrency.lockutils [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.819 2 DEBUG oslo_concurrency.lockutils [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.819 2 DEBUG nova.compute.manager [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.819 2 WARNING nova.compute.manager [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received unexpected event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.819 2 DEBUG nova.compute.manager [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-changed-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.820 2 DEBUG nova.compute.manager [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Refreshing instance network info cache due to event network-changed-29214def-2450-4edd-acc6-84e165aa1e2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.820 2 DEBUG oslo_concurrency.lockutils [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.820 2 DEBUG oslo_concurrency.lockutils [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.820 2 DEBUG nova.network.neutron [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Refreshing network info cache for port 29214def-2450-4edd-acc6-84e165aa1e2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.988 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:48 np0005466013 nova_compute[192144]: 2025-10-02 12:03:48.988 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:49 np0005466013 nova_compute[192144]: 2025-10-02 12:03:49.491 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:49 np0005466013 nova_compute[192144]: 2025-10-02 12:03:49.491 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:03:49Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1d:3d:20 10.100.0.14
Oct  2 08:03:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:03:49Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1d:3d:20 10.100.0.14
Oct  2 08:03:49 np0005466013 nova_compute[192144]: 2025-10-02 12:03:49.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:49 np0005466013 nova_compute[192144]: 2025-10-02 12:03:49.993 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:49 np0005466013 nova_compute[192144]: 2025-10-02 12:03:49.994 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:50 np0005466013 nova_compute[192144]: 2025-10-02 12:03:50.074 2 DEBUG nova.network.neutron [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Updated VIF entry in instance network info cache for port 29214def-2450-4edd-acc6-84e165aa1e2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:03:50 np0005466013 nova_compute[192144]: 2025-10-02 12:03:50.075 2 DEBUG nova.network.neutron [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Updating instance_info_cache with network_info: [{"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:50 np0005466013 nova_compute[192144]: 2025-10-02 12:03:50.094 2 DEBUG oslo_concurrency.lockutils [req-1205499e-958a-45b7-9c7e-d62bbba09be3 req-be879862-9803-4258-a95c-2f86620387b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:03:50 np0005466013 nova_compute[192144]: 2025-10-02 12:03:50.497 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:50 np0005466013 nova_compute[192144]: 2025-10-02 12:03:50.498 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:50 np0005466013 podman[222115]: 2025-10-02 12:03:50.689431438 +0000 UTC m=+0.060832840 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:03:50 np0005466013 podman[222116]: 2025-10-02 12:03:50.692115049 +0000 UTC m=+0.063755548 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:03:50 np0005466013 podman[222117]: 2025-10-02 12:03:50.720161727 +0000 UTC m=+0.090984631 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:03:51 np0005466013 nova_compute[192144]: 2025-10-02 12:03:51.001 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:51 np0005466013 nova_compute[192144]: 2025-10-02 12:03:51.002 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:51 np0005466013 nova_compute[192144]: 2025-10-02 12:03:51.505 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:51 np0005466013 nova_compute[192144]: 2025-10-02 12:03:51.506 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:52 np0005466013 nova_compute[192144]: 2025-10-02 12:03:52.009 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:52 np0005466013 nova_compute[192144]: 2025-10-02 12:03:52.009 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:52 np0005466013 nova_compute[192144]: 2025-10-02 12:03:52.513 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:52 np0005466013 nova_compute[192144]: 2025-10-02 12:03:52.514 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:52 np0005466013 nova_compute[192144]: 2025-10-02 12:03:52.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:53 np0005466013 nova_compute[192144]: 2025-10-02 12:03:53.017 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 5 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:53 np0005466013 nova_compute[192144]: 2025-10-02 12:03:53.018 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:53 np0005466013 nova_compute[192144]: 2025-10-02 12:03:53.521 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 5 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:53 np0005466013 nova_compute[192144]: 2025-10-02 12:03:53.521 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:53 np0005466013 nova_compute[192144]: 2025-10-02 12:03:53.596 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migration running for 5 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 74% remaining (bytes processed=19595264, remaining=55640064, total=75235328). _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10531#033[00m
Oct  2 08:03:54 np0005466013 nova_compute[192144]: 2025-10-02 12:03:54.102 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 6 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:54 np0005466013 nova_compute[192144]: 2025-10-02 12:03:54.102 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:54 np0005466013 nova_compute[192144]: 2025-10-02 12:03:54.606 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 6 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:54 np0005466013 nova_compute[192144]: 2025-10-02 12:03:54.606 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:54 np0005466013 nova_compute[192144]: 2025-10-02 12:03:54.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:55 np0005466013 nova_compute[192144]: 2025-10-02 12:03:55.109 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 7 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:55 np0005466013 nova_compute[192144]: 2025-10-02 12:03:55.109 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:55 np0005466013 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:03:55 np0005466013 systemd[222082]: Activating special unit Exit the Session...
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped target Main User Target.
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped target Basic System.
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped target Paths.
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped target Sockets.
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped target Timers.
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:03:55 np0005466013 systemd[222082]: Closed D-Bus User Message Bus Socket.
Oct  2 08:03:55 np0005466013 systemd[222082]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:03:55 np0005466013 systemd[222082]: Removed slice User Application Slice.
Oct  2 08:03:55 np0005466013 systemd[222082]: Reached target Shutdown.
Oct  2 08:03:55 np0005466013 systemd[222082]: Finished Exit the Session.
Oct  2 08:03:55 np0005466013 systemd[222082]: Reached target Exit the Session.
Oct  2 08:03:55 np0005466013 nova_compute[192144]: 2025-10-02 12:03:55.614 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 7 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:55 np0005466013 nova_compute[192144]: 2025-10-02 12:03:55.615 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:55 np0005466013 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:03:55 np0005466013 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:03:55 np0005466013 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:03:55 np0005466013 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:03:55 np0005466013 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:03:55 np0005466013 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:03:55 np0005466013 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.110 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406636.1104128, 356bc6d6-1101-467e-a020-65876724c955 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.111 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.142 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Current 50 elapsed 8 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.142 2 DEBUG nova.virt.libvirt.migration [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.154 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.157 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.173 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  2 08:03:56 np0005466013 kernel: tap29214def-24 (unregistering): left promiscuous mode
Oct  2 08:03:56 np0005466013 NetworkManager[51205]: <info>  [1759406636.3817] device (tap29214def-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:03:56 np0005466013 ovn_controller[94366]: 2025-10-02T12:03:56Z|00058|binding|INFO|Releasing lport 29214def-2450-4edd-acc6-84e165aa1e2c from this chassis (sb_readonly=0)
Oct  2 08:03:56 np0005466013 ovn_controller[94366]: 2025-10-02T12:03:56Z|00059|binding|INFO|Setting lport 29214def-2450-4edd-acc6-84e165aa1e2c down in Southbound
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:56 np0005466013 ovn_controller[94366]: 2025-10-02T12:03:56Z|00060|binding|INFO|Removing iface tap29214def-24 ovn-installed in OVS
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.426 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:3d:20 10.100.0.14'], port_security=['fa:16:3e:1d:3d:20 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c9f3d658-5c7a-4803-9bbb-01adfb7e88ca'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '356bc6d6-1101-467e-a020-65876724c955', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-664b6526-6df1-4024-9bab-37218e6c18bd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a459d514-aab4-4030-9850-e066abdeaccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddfb51e-1095-4b3d-a2dc-f2557cf13b11, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=29214def-2450-4edd-acc6-84e165aa1e2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.428 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 29214def-2450-4edd-acc6-84e165aa1e2c in datapath 664b6526-6df1-4024-9bab-37218e6c18bd unbound from our chassis#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.429 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 664b6526-6df1-4024-9bab-37218e6c18bd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.430 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[36c0a5aa-28f1-4899-9bf8-ba213ed0bd8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.431 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd namespace which is not needed anymore#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:56 np0005466013 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000014.scope: Deactivated successfully.
Oct  2 08:03:56 np0005466013 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000014.scope: Consumed 12.869s CPU time.
Oct  2 08:03:56 np0005466013 systemd-machined[152202]: Machine qemu-7-instance-00000014 terminated.
Oct  2 08:03:56 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[221957]: [NOTICE]   (221961) : haproxy version is 2.8.14-c23fe91
Oct  2 08:03:56 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[221957]: [NOTICE]   (221961) : path to executable is /usr/sbin/haproxy
Oct  2 08:03:56 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[221957]: [WARNING]  (221961) : Exiting Master process...
Oct  2 08:03:56 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[221957]: [ALERT]    (221961) : Current worker (221963) exited with code 143 (Terminated)
Oct  2 08:03:56 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[221957]: [WARNING]  (221961) : All workers exited. Exiting... (0)
Oct  2 08:03:56 np0005466013 systemd[1]: libpod-07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016.scope: Deactivated successfully.
Oct  2 08:03:56 np0005466013 podman[222208]: 2025-10-02 12:03:56.556126726 +0000 UTC m=+0.042008637 container died 07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:03:56 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016-userdata-shm.mount: Deactivated successfully.
Oct  2 08:03:56 np0005466013 systemd[1]: var-lib-containers-storage-overlay-f7382dd794cc10686ff31a7d63e163c106aafd3ed219bd5a72d9cbb0dd253aa4-merged.mount: Deactivated successfully.
Oct  2 08:03:56 np0005466013 podman[222208]: 2025-10-02 12:03:56.597582526 +0000 UTC m=+0.083464437 container cleanup 07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:03:56 np0005466013 systemd[1]: libpod-conmon-07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016.scope: Deactivated successfully.
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.603 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.604 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.604 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.644 2 DEBUG nova.virt.libvirt.guest [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '356bc6d6-1101-467e-a020-65876724c955' (instance-00000014) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.644 2 INFO nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migration operation has completed#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.645 2 INFO nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] _post_live_migration() is started..#033[00m
Oct  2 08:03:56 np0005466013 podman[222254]: 2025-10-02 12:03:56.678746464 +0000 UTC m=+0.053121450 container remove 07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.684 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[93ace942-a816-4715-a9db-ad7da201a48c]: (4, ('Thu Oct  2 12:03:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd (07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016)\n07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016\nThu Oct  2 12:03:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd (07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016)\n07d634ca9f1a71e542e8f0edea4d134f8eb93ad27c1c1bd6a94a552ca94a3016\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.686 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[518048b4-3f95-4d9b-899e-bdd196992063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.687 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap664b6526-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:56 np0005466013 kernel: tap664b6526-60: left promiscuous mode
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.714 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[076c9240-b05d-4744-9de8-f31b85b384f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.748 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[04d5754e-21e0-4b07-8796-4ffb01605333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.750 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[44071620-9286-400b-ae95-bfa62ff1f641]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.767 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c10191b1-a781-4953-80a0-bc99b9c54f26]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460753, 'reachable_time': 44638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222271, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.770 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:03:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:03:56.770 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b87eeb5d-e577-401f-9c8e-cc2d9f11e455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:03:56 np0005466013 systemd[1]: run-netns-ovnmeta\x2d664b6526\x2d6df1\x2d4024\x2d9bab\x2d37218e6c18bd.mount: Deactivated successfully.
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.833 2 DEBUG nova.compute.manager [req-01d13236-788a-4b6c-a774-6f81723057c5 req-b3023c58-e71b-4bdd-b1fa-6e2bf3986bb4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.833 2 DEBUG oslo_concurrency.lockutils [req-01d13236-788a-4b6c-a774-6f81723057c5 req-b3023c58-e71b-4bdd-b1fa-6e2bf3986bb4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.833 2 DEBUG oslo_concurrency.lockutils [req-01d13236-788a-4b6c-a774-6f81723057c5 req-b3023c58-e71b-4bdd-b1fa-6e2bf3986bb4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.834 2 DEBUG oslo_concurrency.lockutils [req-01d13236-788a-4b6c-a774-6f81723057c5 req-b3023c58-e71b-4bdd-b1fa-6e2bf3986bb4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.834 2 DEBUG nova.compute.manager [req-01d13236-788a-4b6c-a774-6f81723057c5 req-b3023c58-e71b-4bdd-b1fa-6e2bf3986bb4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:56 np0005466013 nova_compute[192144]: 2025-10-02 12:03:56.834 2 DEBUG nova.compute.manager [req-01d13236-788a-4b6c-a774-6f81723057c5 req-b3023c58-e71b-4bdd-b1fa-6e2bf3986bb4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:03:57 np0005466013 nova_compute[192144]: 2025-10-02 12:03:57.470 2 DEBUG nova.compute.manager [req-0f039c53-ce0f-47c9-921b-92631098343d req-1bb01039-0c60-4ac6-b5d1-3ad25a3c745f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:57 np0005466013 nova_compute[192144]: 2025-10-02 12:03:57.470 2 DEBUG oslo_concurrency.lockutils [req-0f039c53-ce0f-47c9-921b-92631098343d req-1bb01039-0c60-4ac6-b5d1-3ad25a3c745f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:57 np0005466013 nova_compute[192144]: 2025-10-02 12:03:57.471 2 DEBUG oslo_concurrency.lockutils [req-0f039c53-ce0f-47c9-921b-92631098343d req-1bb01039-0c60-4ac6-b5d1-3ad25a3c745f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:57 np0005466013 nova_compute[192144]: 2025-10-02 12:03:57.471 2 DEBUG oslo_concurrency.lockutils [req-0f039c53-ce0f-47c9-921b-92631098343d req-1bb01039-0c60-4ac6-b5d1-3ad25a3c745f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:57 np0005466013 nova_compute[192144]: 2025-10-02 12:03:57.471 2 DEBUG nova.compute.manager [req-0f039c53-ce0f-47c9-921b-92631098343d req-1bb01039-0c60-4ac6-b5d1-3ad25a3c745f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:57 np0005466013 nova_compute[192144]: 2025-10-02 12:03:57.471 2 DEBUG nova.compute.manager [req-0f039c53-ce0f-47c9-921b-92631098343d req-1bb01039-0c60-4ac6-b5d1-3ad25a3c745f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:03:57 np0005466013 nova_compute[192144]: 2025-10-02 12:03:57.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.315 2 DEBUG nova.network.neutron [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Activated binding for port 29214def-2450-4edd-acc6-84e165aa1e2c and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.316 2 DEBUG nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.316 2 DEBUG nova.virt.libvirt.vif [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:03:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-507794369',display_name='tempest-LiveMigrationTest-server-507794369',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-507794369',id=20,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:03:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-hsf0qpxd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:03:48Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=356bc6d6-1101-467e-a020-65876724c955,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.316 2 DEBUG nova.network.os_vif_util [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converting VIF {"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.317 2 DEBUG nova.network.os_vif_util [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.317 2 DEBUG os_vif [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.319 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29214def-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.325 2 INFO os_vif [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24')#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.326 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.326 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.327 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.327 2 DEBUG nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.327 2 INFO nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Deleting instance files /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955_del#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.328 2 INFO nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Deletion of /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955_del complete#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.654 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406623.6526957, c4dcd0cb-13dd-4990-9232-8a51a6c5eff4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.654 2 INFO nova.compute.manager [-] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:03:58 np0005466013 nova_compute[192144]: 2025-10-02 12:03:58.691 2 DEBUG nova.compute.manager [None req-fc0fc941-3816-4fc4-8bd9-ce3e6e3a392d - - - - - -] [instance: c4dcd0cb-13dd-4990-9232-8a51a6c5eff4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.042 2 DEBUG nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.043 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.044 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.044 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.044 2 DEBUG nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.045 2 WARNING nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received unexpected event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.045 2 DEBUG nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.045 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.045 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.046 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.046 2 DEBUG nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.046 2 WARNING nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received unexpected event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.046 2 DEBUG nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.047 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.047 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.047 2 DEBUG oslo_concurrency.lockutils [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.047 2 DEBUG nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:59 np0005466013 nova_compute[192144]: 2025-10-02 12:03:59.048 2 WARNING nova.compute.manager [req-21d275b7-a818-4f35-a94d-eb7d054424d0 req-b9992ae9-ea31-4a23-8a0b-6cb3962923e5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received unexpected event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:04:00 np0005466013 podman[222272]: 2025-10-02 12:04:00.692987442 +0000 UTC m=+0.063956293 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  2 08:04:01 np0005466013 nova_compute[192144]: 2025-10-02 12:04:01.166 2 DEBUG nova.compute.manager [req-0534a7f0-aecc-4294-a5bb-5e9bc687e62c req-80fae21a-a426-4d92-8770-a706093b4822 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:01 np0005466013 nova_compute[192144]: 2025-10-02 12:04:01.167 2 DEBUG oslo_concurrency.lockutils [req-0534a7f0-aecc-4294-a5bb-5e9bc687e62c req-80fae21a-a426-4d92-8770-a706093b4822 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:01 np0005466013 nova_compute[192144]: 2025-10-02 12:04:01.167 2 DEBUG oslo_concurrency.lockutils [req-0534a7f0-aecc-4294-a5bb-5e9bc687e62c req-80fae21a-a426-4d92-8770-a706093b4822 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:01 np0005466013 nova_compute[192144]: 2025-10-02 12:04:01.168 2 DEBUG oslo_concurrency.lockutils [req-0534a7f0-aecc-4294-a5bb-5e9bc687e62c req-80fae21a-a426-4d92-8770-a706093b4822 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:01 np0005466013 nova_compute[192144]: 2025-10-02 12:04:01.168 2 DEBUG nova.compute.manager [req-0534a7f0-aecc-4294-a5bb-5e9bc687e62c req-80fae21a-a426-4d92-8770-a706093b4822 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:01 np0005466013 nova_compute[192144]: 2025-10-02 12:04:01.168 2 WARNING nova.compute.manager [req-0534a7f0-aecc-4294-a5bb-5e9bc687e62c req-80fae21a-a426-4d92-8770-a706093b4822 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received unexpected event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:04:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:02.277 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:02.278 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:02.278 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:02 np0005466013 nova_compute[192144]: 2025-10-02 12:04:02.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:02 np0005466013 nova_compute[192144]: 2025-10-02 12:04:02.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:02 np0005466013 nova_compute[192144]: 2025-10-02 12:04:02.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:04:03 np0005466013 nova_compute[192144]: 2025-10-02 12:04:03.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:03.321 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:03.323 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:04:03 np0005466013 podman[222293]: 2025-10-02 12:04:03.6825176 +0000 UTC m=+0.056141001 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:04:03 np0005466013 podman[222292]: 2025-10-02 12:04:03.712805025 +0000 UTC m=+0.089682123 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.276 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.276 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.277 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.303 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.303 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.304 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.304 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.501 2 WARNING nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.502 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5773MB free_disk=73.46427917480469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.502 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.503 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.558 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Migration for instance 356bc6d6-1101-467e-a020-65876724c955 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.578 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.601 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Migration d5491466-7696-4e8f-b88a-dec3442315d4 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.602 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.602 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.664 2 DEBUG nova.compute.provider_tree [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.679 2 DEBUG nova.scheduler.client.report [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.724 2 DEBUG nova.compute.resource_tracker [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.725 2 DEBUG oslo_concurrency.lockutils [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.748 2 INFO nova.compute.manager [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.828 2 INFO nova.scheduler.client.report [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Deleted allocation for migration d5491466-7696-4e8f-b88a-dec3442315d4#033[00m
Oct  2 08:04:04 np0005466013 nova_compute[192144]: 2025-10-02 12:04:04.828 2 DEBUG nova.virt.libvirt.driver [None req-08fef021-4e4c-49d5-b19b-f48e19ff10a8 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.360 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "05cee82b-0fd3-4016-893a-4a0a7fc48322" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.361 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "05cee82b-0fd3-4016-893a-4a0a7fc48322" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.381 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.505 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.505 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.513 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.514 2 INFO nova.compute.claims [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.624 2 DEBUG nova.compute.provider_tree [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.637 2 DEBUG nova.scheduler.client.report [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.654 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.655 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.747 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.748 2 DEBUG nova.network.neutron [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.772 2 INFO nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.794 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.904 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.906 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.906 2 INFO nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Creating image(s)#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.906 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "/var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.907 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "/var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.907 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "/var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.918 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.974 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.976 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.977 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:05 np0005466013 nova_compute[192144]: 2025-10-02 12:04:05.993 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.017 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.061 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.062 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.108 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.109 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.110 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.163 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.164 2 DEBUG nova.virt.disk.api [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Checking if we can resize image /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.165 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.206 2 DEBUG nova.network.neutron [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.206 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.217 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.218 2 DEBUG nova.virt.disk.api [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Cannot resize image /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.218 2 DEBUG nova.objects.instance [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lazy-loading 'migration_context' on Instance uuid 05cee82b-0fd3-4016-893a-4a0a7fc48322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.232 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.233 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Ensure instance console log exists: /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.233 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.234 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.234 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.235 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.239 2 WARNING nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.245 2 DEBUG nova.virt.libvirt.host [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.246 2 DEBUG nova.virt.libvirt.host [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.249 2 DEBUG nova.virt.libvirt.host [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.249 2 DEBUG nova.virt.libvirt.host [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.250 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.251 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.251 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.251 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.252 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.252 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.252 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.252 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.253 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.253 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.253 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.253 2 DEBUG nova.virt.hardware [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.257 2 DEBUG nova.objects.instance [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 05cee82b-0fd3-4016-893a-4a0a7fc48322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.271 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <uuid>05cee82b-0fd3-4016-893a-4a0a7fc48322</uuid>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <name>instance-00000018</name>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerDiagnosticsTest-server-602651795</nova:name>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:04:06</nova:creationTime>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:        <nova:user uuid="d4a6f546eb184617896f7b31d695e198">tempest-ServerDiagnosticsTest-487712087-project-member</nova:user>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:        <nova:project uuid="6a7bc078cca045fab4ffbbcaaa2f00e7">tempest-ServerDiagnosticsTest-487712087</nova:project>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <entry name="serial">05cee82b-0fd3-4016-893a-4a0a7fc48322</entry>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <entry name="uuid">05cee82b-0fd3-4016-893a-4a0a7fc48322</entry>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk.config"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/console.log" append="off"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:04:06 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:04:06 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:04:06 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:04:06 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.316 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.316 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.317 2 INFO nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Using config drive
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.465 2 INFO nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Creating config drive at /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk.config
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.470 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp29gzlwkh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.594 2 DEBUG oslo_concurrency.processutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp29gzlwkh" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.617 2 DEBUG nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Creating tmpfile /var/lib/nova/instances/tmp6xzn3h2n to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct  2 08:04:06 np0005466013 systemd-machined[152202]: New machine qemu-9-instance-00000018.
Oct  2 08:04:06 np0005466013 systemd[1]: Started Virtual Machine qemu-9-instance-00000018.
Oct  2 08:04:06 np0005466013 nova_compute[192144]: 2025-10-02 12:04:06.714 2 DEBUG nova.compute.manager [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6xzn3h2n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.600 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406647.6004837, 05cee82b-0fd3-4016-893a-4a0a7fc48322 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.601 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.603 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.603 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.607 2 INFO nova.virt.libvirt.driver [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Instance spawned successfully.#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.608 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.646 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.647 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.647 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.647 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.648 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.648 2 DEBUG nova.virt.libvirt.driver [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.651 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.654 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.708 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.708 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406647.6012638, 05cee82b-0fd3-4016-893a-4a0a7fc48322 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.709 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] VM Started (Lifecycle Event)#033[00m
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.749 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.752 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.787 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.788 2 INFO nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Took 1.88 seconds to spawn the instance on the hypervisor.
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.788 2 DEBUG nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.913 2 INFO nova.compute.manager [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Took 2.45 seconds to build instance.
Oct  2 08:04:07 np0005466013 nova_compute[192144]: 2025-10-02 12:04:07.938 2 DEBUG oslo_concurrency.lockutils [None req-6c632247-c5d9-49c0-b720-7bac59e1e6da d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "05cee82b-0fd3-4016-893a-4a0a7fc48322" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:08 np0005466013 nova_compute[192144]: 2025-10-02 12:04:08.008 2 DEBUG nova.compute.manager [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6xzn3h2n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='356bc6d6-1101-467e-a020-65876724c955',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct  2 08:04:08 np0005466013 nova_compute[192144]: 2025-10-02 12:04:08.037 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:04:08 np0005466013 nova_compute[192144]: 2025-10-02 12:04:08.037 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquired lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:04:08 np0005466013 nova_compute[192144]: 2025-10-02 12:04:08.038 2 DEBUG nova.network.neutron [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:04:08 np0005466013 nova_compute[192144]: 2025-10-02 12:04:08.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:08 np0005466013 podman[222377]: 2025-10-02 12:04:08.673609615 +0000 UTC m=+0.052427131 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:04:08 np0005466013 podman[222378]: 2025-10-02 12:04:08.706715088 +0000 UTC m=+0.080951392 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid)
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.229 2 DEBUG nova.compute.manager [None req-499d22e6-d944-47dd-9674-5a1558686b1d 00b36470c64247368a78e56bab9a499e a17a5c1ddeee4d66ad015809adaedc08 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.233 2 INFO nova.compute.manager [None req-499d22e6-d944-47dd-9674-5a1558686b1d 00b36470c64247368a78e56bab9a499e a17a5c1ddeee4d66ad015809adaedc08 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Retrieving diagnostics
Oct  2 08:04:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:09.326 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.546 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "05cee82b-0fd3-4016-893a-4a0a7fc48322" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.546 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "05cee82b-0fd3-4016-893a-4a0a7fc48322" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.547 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "05cee82b-0fd3-4016-893a-4a0a7fc48322-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.547 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "05cee82b-0fd3-4016-893a-4a0a7fc48322-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.547 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "05cee82b-0fd3-4016-893a-4a0a7fc48322-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.563 2 INFO nova.compute.manager [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Terminating instance
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.583 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "refresh_cache-05cee82b-0fd3-4016-893a-4a0a7fc48322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.584 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquired lock "refresh_cache-05cee82b-0fd3-4016-893a-4a0a7fc48322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.585 2 DEBUG nova.network.neutron [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.780 2 DEBUG nova.network.neutron [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.892 2 DEBUG nova.network.neutron [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Updating instance_info_cache with network_info: [{"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.909 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Releasing lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.920 2 DEBUG nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6xzn3h2n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='356bc6d6-1101-467e-a020-65876724c955',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.921 2 DEBUG nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Creating instance directory: /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.921 2 DEBUG nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Creating disk.info with the contents: {'/var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk': 'qcow2', '/var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.922 2 DEBUG nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.924 2 DEBUG nova.objects.instance [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 356bc6d6-1101-467e-a020-65876724c955 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:09 np0005466013 nova_compute[192144]: 2025-10-02 12:04:09.964 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.028 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.030 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.031 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.059 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.085 2 DEBUG nova.network.neutron [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.114 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Releasing lock "refresh_cache-05cee82b-0fd3-4016-893a-4a0a7fc48322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.116 2 DEBUG nova.compute.manager [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.128 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.128 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:10 np0005466013 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000018.scope: Deactivated successfully.
Oct  2 08:04:10 np0005466013 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000018.scope: Consumed 3.352s CPU time.
Oct  2 08:04:10 np0005466013 systemd-machined[152202]: Machine qemu-9-instance-00000018 terminated.
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.163 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.165 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.166 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.218 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.219 2 DEBUG nova.virt.disk.api [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Checking if we can resize image /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.220 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.280 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.281 2 DEBUG nova.virt.disk.api [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Cannot resize image /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.281 2 DEBUG nova.objects.instance [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lazy-loading 'migration_context' on Instance uuid 356bc6d6-1101-467e-a020-65876724c955 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.294 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.317 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.318 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk.config to /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.319 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk.config /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.366 2 INFO nova.virt.libvirt.driver [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Instance destroyed successfully.
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.367 2 DEBUG nova.objects.instance [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lazy-loading 'resources' on Instance uuid 05cee82b-0fd3-4016-893a-4a0a7fc48322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.383 2 INFO nova.virt.libvirt.driver [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Deleting instance files /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322_del
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.384 2 INFO nova.virt.libvirt.driver [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Deletion of /var/lib/nova/instances/05cee82b-0fd3-4016-893a-4a0a7fc48322_del complete
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.459 2 INFO nova.compute.manager [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Took 0.34 seconds to destroy the instance on the hypervisor.
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.460 2 DEBUG oslo.service.loopingcall [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.461 2 DEBUG nova.compute.manager [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.461 2 DEBUG nova.network.neutron [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.835 2 DEBUG oslo_concurrency.processutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk.config /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.836 2 DEBUG nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.837 2 DEBUG nova.virt.libvirt.vif [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:03:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-507794369',display_name='tempest-LiveMigrationTest-server-507794369',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-507794369',id=20,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:03:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-hsf0qpxd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:04:03Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=356bc6d6-1101-467e-a020-65876724c955,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.837 2 DEBUG nova.network.os_vif_util [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converting VIF {"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.838 2 DEBUG nova.network.os_vif_util [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.838 2 DEBUG os_vif [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29214def-24, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29214def-24, col_values=(('external_ids', {'iface-id': '29214def-2450-4edd-acc6-84e165aa1e2c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:3d:20', 'vm-uuid': '356bc6d6-1101-467e-a020-65876724c955'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:10 np0005466013 NetworkManager[51205]: <info>  [1759406650.8462] manager: (tap29214def-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.853 2 INFO os_vif [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24')#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.853 2 DEBUG nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.853 2 DEBUG nova.compute.manager [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6xzn3h2n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='356bc6d6-1101-467e-a020-65876724c955',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.966 2 DEBUG nova.network.neutron [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:04:10 np0005466013 nova_compute[192144]: 2025-10-02 12:04:10.979 2 DEBUG nova.network.neutron [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.000 2 INFO nova.compute.manager [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Took 0.54 seconds to deallocate network for instance.#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.009 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.040 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.041 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.041 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.041 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.081 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.081 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.166 2 DEBUG nova.compute.provider_tree [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.182 2 DEBUG nova.scheduler.client.report [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.211 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.244 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.245 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5756MB free_disk=73.46358871459961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.245 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.245 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.250 2 INFO nova.scheduler.client.report [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Deleted allocations for instance 05cee82b-0fd3-4016-893a-4a0a7fc48322#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.311 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Migration for instance 356bc6d6-1101-467e-a020-65876724c955 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.340 2 INFO nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Updating resource usage from migration 8b92b020-fef8-4e24-b417-6318a75e3466#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.341 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Starting to track incoming migration 8b92b020-fef8-4e24-b417-6318a75e3466 with flavor 9ac83da7-f31e-4467-8569-d28002f6aeed _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.352 2 DEBUG oslo_concurrency.lockutils [None req-96d982e5-c2b0-45b6-a596-97bb5b214477 d4a6f546eb184617896f7b31d695e198 6a7bc078cca045fab4ffbbcaaa2f00e7 - - default default] Lock "05cee82b-0fd3-4016-893a-4a0a7fc48322" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.385 2 WARNING nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 356bc6d6-1101-467e-a020-65876724c955 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.385 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.386 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.427 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.440 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.464 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.464 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.464 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.464 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.483 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.603 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406636.6000688, 356bc6d6-1101-467e-a020-65876724c955 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.604 2 INFO nova.compute.manager [-] [instance: 356bc6d6-1101-467e-a020-65876724c955] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:04:11 np0005466013 nova_compute[192144]: 2025-10-02 12:04:11.622 2 DEBUG nova.compute.manager [None req-80031c6a-b519-43e4-8ffd-210825f0cf97 - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.330 2 DEBUG nova.network.neutron [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Port 29214def-2450-4edd-acc6-84e165aa1e2c updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.343 2 DEBUG nova.compute.manager [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6xzn3h2n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='356bc6d6-1101-467e-a020-65876724c955',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Oct  2 08:04:12 np0005466013 systemd[1]: Starting libvirt proxy daemon...
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.468 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.469 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.469 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.485 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.485 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.485 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.485 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:04:12 np0005466013 systemd[1]: Started libvirt proxy daemon.
Oct  2 08:04:12 np0005466013 kernel: tap29214def-24: entered promiscuous mode
Oct  2 08:04:12 np0005466013 systemd-udevd[222425]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:12 np0005466013 NetworkManager[51205]: <info>  [1759406652.6178] manager: (tap29214def-24): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Oct  2 08:04:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:04:12Z|00061|binding|INFO|Claiming lport 29214def-2450-4edd-acc6-84e165aa1e2c for this additional chassis.
Oct  2 08:04:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:04:12Z|00062|binding|INFO|29214def-2450-4edd-acc6-84e165aa1e2c: Claiming fa:16:3e:1d:3d:20 10.100.0.14
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:12 np0005466013 NetworkManager[51205]: <info>  [1759406652.6327] device (tap29214def-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:04:12 np0005466013 NetworkManager[51205]: <info>  [1759406652.6335] device (tap29214def-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:04:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:04:12Z|00063|binding|INFO|Setting lport 29214def-2450-4edd-acc6-84e165aa1e2c ovn-installed in OVS
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:12 np0005466013 systemd-machined[152202]: New machine qemu-10-instance-00000014.
Oct  2 08:04:12 np0005466013 systemd[1]: Started Virtual Machine qemu-10-instance-00000014.
Oct  2 08:04:12 np0005466013 nova_compute[192144]: 2025-10-02 12:04:12.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:13 np0005466013 nova_compute[192144]: 2025-10-02 12:04:13.364 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406653.3642592, 356bc6d6-1101-467e-a020-65876724c955 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:13 np0005466013 nova_compute[192144]: 2025-10-02 12:04:13.365 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] VM Started (Lifecycle Event)#033[00m
Oct  2 08:04:13 np0005466013 nova_compute[192144]: 2025-10-02 12:04:13.391 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:13 np0005466013 nova_compute[192144]: 2025-10-02 12:04:13.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.021 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.022 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.290 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406654.2903235, 356bc6d6-1101-467e-a020-65876724c955 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.291 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.319 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.323 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.352 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:14 np0005466013 nova_compute[192144]: 2025-10-02 12:04:14.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:04:15Z|00064|binding|INFO|Claiming lport 29214def-2450-4edd-acc6-84e165aa1e2c for this chassis.
Oct  2 08:04:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:04:15Z|00065|binding|INFO|29214def-2450-4edd-acc6-84e165aa1e2c: Claiming fa:16:3e:1d:3d:20 10.100.0.14
Oct  2 08:04:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:04:15Z|00066|binding|INFO|Setting lport 29214def-2450-4edd-acc6-84e165aa1e2c up in Southbound
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.480 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:3d:20 10.100.0.14'], port_security=['fa:16:3e:1d:3d:20 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '356bc6d6-1101-467e-a020-65876724c955', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-664b6526-6df1-4024-9bab-37218e6c18bd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'neutron:revision_number': '21', 'neutron:security_group_ids': 'a459d514-aab4-4030-9850-e066abdeaccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddfb51e-1095-4b3d-a2dc-f2557cf13b11, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=29214def-2450-4edd-acc6-84e165aa1e2c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.482 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 29214def-2450-4edd-acc6-84e165aa1e2c in datapath 664b6526-6df1-4024-9bab-37218e6c18bd bound to our chassis#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.483 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 664b6526-6df1-4024-9bab-37218e6c18bd#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.495 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[21237d93-fd1f-4a4f-b973-70715694fc9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.496 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap664b6526-61 in ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.499 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap664b6526-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.499 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2da67c63-7c00-428b-8b1f-494ff3fb6381]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.500 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[872b8d0c-b750-4955-b68b-3be4a5f7c752]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.510 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[9399db91-ba71-4695-a642-1f8847238473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.533 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9df62641-3b87-4d0a-9d34-87a819df6010]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.559 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c3bac6-3449-4377-a14b-99cef6475ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.567 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e99cb67b-f679-47b6-ba5d-dec98acbbd82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 NetworkManager[51205]: <info>  [1759406655.5680] manager: (tap664b6526-60): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Oct  2 08:04:15 np0005466013 systemd-udevd[222511]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.598 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[917f3fdb-6d08-471e-8ed4-e6158497a387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.602 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[10bbd94c-eeb3-4cf6-8001-7ad2cd5cccac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 NetworkManager[51205]: <info>  [1759406655.6228] device (tap664b6526-60): carrier: link connected
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.627 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ac2e34-f598-48d0-a26c-a93dfbbfdb2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.642 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8f2c32ef-d080-41aa-a6c1-7e57e871e36f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap664b6526-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:8c:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464724, 'reachable_time': 26579, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222530, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.656 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[992d2c45-c73f-4977-820b-9ddc4c7a84d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:8c2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464724, 'tstamp': 464724}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222531, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.671 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9035672c-93a1-4fbe-89ac-255a13caa4cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap664b6526-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:8c:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464724, 'reachable_time': 26579, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222532, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.695 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[500c5d6b-f8eb-40a4-8214-7eb8050f8331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.754 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e66bc0-2a0f-421e-8af0-c75befaed97c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.756 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap664b6526-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.756 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.757 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap664b6526-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:15 np0005466013 NetworkManager[51205]: <info>  [1759406655.7593] manager: (tap664b6526-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Oct  2 08:04:15 np0005466013 kernel: tap664b6526-60: entered promiscuous mode
Oct  2 08:04:15 np0005466013 nova_compute[192144]: 2025-10-02 12:04:15.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:15 np0005466013 nova_compute[192144]: 2025-10-02 12:04:15.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.762 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap664b6526-60, col_values=(('external_ids', {'iface-id': '2f7dc774-b718-4d9e-9655-fbc5ffa141e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:04:15Z|00067|binding|INFO|Releasing lport 2f7dc774-b718-4d9e-9655-fbc5ffa141e8 from this chassis (sb_readonly=0)
Oct  2 08:04:15 np0005466013 nova_compute[192144]: 2025-10-02 12:04:15.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:15 np0005466013 nova_compute[192144]: 2025-10-02 12:04:15.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.778 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/664b6526-6df1-4024-9bab-37218e6c18bd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/664b6526-6df1-4024-9bab-37218e6c18bd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.779 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1c9a7b-3fe6-436d-abd0-381a0fa745e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.779 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-664b6526-6df1-4024-9bab-37218e6c18bd
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/664b6526-6df1-4024-9bab-37218e6c18bd.pid.haproxy
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 664b6526-6df1-4024-9bab-37218e6c18bd
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:04:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:04:15.780 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'env', 'PROCESS_TAG=haproxy-664b6526-6df1-4024-9bab-37218e6c18bd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/664b6526-6df1-4024-9bab-37218e6c18bd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:04:15 np0005466013 nova_compute[192144]: 2025-10-02 12:04:15.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:15 np0005466013 nova_compute[192144]: 2025-10-02 12:04:15.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:16 np0005466013 nova_compute[192144]: 2025-10-02 12:04:16.028 2 INFO nova.compute.manager [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Post operation of migration started#033[00m
Oct  2 08:04:16 np0005466013 podman[222565]: 2025-10-02 12:04:16.177405688 +0000 UTC m=+0.054326629 container create 26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  2 08:04:16 np0005466013 systemd[1]: Started libpod-conmon-26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5.scope.
Oct  2 08:04:16 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:04:16 np0005466013 podman[222565]: 2025-10-02 12:04:16.145821813 +0000 UTC m=+0.022742774 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:04:16 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daf239092209cf7d71a2a3e72bb415957e53edb6ffdec3c8f0db392c51a41d96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:04:16 np0005466013 podman[222565]: 2025-10-02 12:04:16.257023797 +0000 UTC m=+0.133944758 container init 26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:16 np0005466013 podman[222565]: 2025-10-02 12:04:16.261739723 +0000 UTC m=+0.138660664 container start 26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:04:16 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222580]: [NOTICE]   (222584) : New worker (222586) forked
Oct  2 08:04:16 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222580]: [NOTICE]   (222584) : Loading success.
Oct  2 08:04:16 np0005466013 nova_compute[192144]: 2025-10-02 12:04:16.628 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:16 np0005466013 nova_compute[192144]: 2025-10-02 12:04:16.628 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquired lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:16 np0005466013 nova_compute[192144]: 2025-10-02 12:04:16.628 2 DEBUG nova.network.neutron [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:04:17 np0005466013 nova_compute[192144]: 2025-10-02 12:04:17.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:19 np0005466013 nova_compute[192144]: 2025-10-02 12:04:19.455 2 DEBUG nova.network.neutron [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Updating instance_info_cache with network_info: [{"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:19 np0005466013 nova_compute[192144]: 2025-10-02 12:04:19.592 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Releasing lock "refresh_cache-356bc6d6-1101-467e-a020-65876724c955" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:19 np0005466013 nova_compute[192144]: 2025-10-02 12:04:19.626 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:19 np0005466013 nova_compute[192144]: 2025-10-02 12:04:19.626 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:19 np0005466013 nova_compute[192144]: 2025-10-02 12:04:19.627 2 DEBUG oslo_concurrency.lockutils [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:19 np0005466013 nova_compute[192144]: 2025-10-02 12:04:19.630 2 INFO nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  2 08:04:19 np0005466013 virtqemud[191867]: Domain id=10 name='instance-00000014' uuid=356bc6d6-1101-467e-a020-65876724c955 is tainted: custom-monitor
Oct  2 08:04:20 np0005466013 nova_compute[192144]: 2025-10-02 12:04:20.637 2 INFO nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  2 08:04:20 np0005466013 nova_compute[192144]: 2025-10-02 12:04:20.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:21 np0005466013 nova_compute[192144]: 2025-10-02 12:04:21.642 2 INFO nova.virt.libvirt.driver [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  2 08:04:21 np0005466013 nova_compute[192144]: 2025-10-02 12:04:21.646 2 DEBUG nova.compute.manager [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:21 np0005466013 podman[222596]: 2025-10-02 12:04:21.6766093 +0000 UTC m=+0.047479118 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:04:21 np0005466013 podman[222595]: 2025-10-02 12:04:21.677017603 +0000 UTC m=+0.048361575 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:04:21 np0005466013 podman[222597]: 2025-10-02 12:04:21.716760429 +0000 UTC m=+0.082977573 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:04:21 np0005466013 nova_compute[192144]: 2025-10-02 12:04:21.882 2 DEBUG nova.objects.instance [None req-0a34d7c2-6d84-4b74-ba5b-49c921863df7 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:04:22 np0005466013 nova_compute[192144]: 2025-10-02 12:04:22.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.359 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.363 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406650.362652, 05cee82b-0fd3-4016-893a-4a0a7fc48322 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.364 2 INFO nova.compute.manager [-] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.377 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Triggering sync for uuid 356bc6d6-1101-467e-a020-65876724c955 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.378 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.378 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "356bc6d6-1101-467e-a020-65876724c955" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.383 2 DEBUG nova.compute.manager [None req-578dc2c5-ffd3-486b-95ef-3e9b0c019d3d - - - - - -] [instance: 05cee82b-0fd3-4016-893a-4a0a7fc48322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.417 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "356bc6d6-1101-467e-a020-65876724c955" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:25 np0005466013 nova_compute[192144]: 2025-10-02 12:04:25.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466013 nova_compute[192144]: 2025-10-02 12:04:27.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466013 nova_compute[192144]: 2025-10-02 12:04:30.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:31 np0005466013 podman[222658]: 2025-10-02 12:04:31.679516414 +0000 UTC m=+0.057823537 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:04:32 np0005466013 nova_compute[192144]: 2025-10-02 12:04:32.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:34 np0005466013 podman[222679]: 2025-10-02 12:04:34.678039873 +0000 UTC m=+0.056001311 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., vcs-type=git)
Oct  2 08:04:34 np0005466013 podman[222678]: 2025-10-02 12:04:34.678180288 +0000 UTC m=+0.058658864 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:35 np0005466013 nova_compute[192144]: 2025-10-02 12:04:35.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005466013 nova_compute[192144]: 2025-10-02 12:04:37.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:39 np0005466013 podman[222719]: 2025-10-02 12:04:39.667135612 +0000 UTC m=+0.048428606 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:04:39 np0005466013 podman[222720]: 2025-10-02 12:04:39.672948092 +0000 UTC m=+0.049852501 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:04:40 np0005466013 nova_compute[192144]: 2025-10-02 12:04:40.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:42 np0005466013 nova_compute[192144]: 2025-10-02 12:04:42.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:45 np0005466013 nova_compute[192144]: 2025-10-02 12:04:45.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:47 np0005466013 nova_compute[192144]: 2025-10-02 12:04:47.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:50 np0005466013 nova_compute[192144]: 2025-10-02 12:04:50.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:52 np0005466013 podman[222761]: 2025-10-02 12:04:52.676214825 +0000 UTC m=+0.052385409 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:04:52 np0005466013 podman[222762]: 2025-10-02 12:04:52.676849124 +0000 UTC m=+0.049416657 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:52 np0005466013 podman[222763]: 2025-10-02 12:04:52.714221909 +0000 UTC m=+0.082424487 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:04:52 np0005466013 nova_compute[192144]: 2025-10-02 12:04:52.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:55 np0005466013 nova_compute[192144]: 2025-10-02 12:04:55.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:57 np0005466013 nova_compute[192144]: 2025-10-02 12:04:57.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:59 np0005466013 nova_compute[192144]: 2025-10-02 12:04:59.702 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:59 np0005466013 nova_compute[192144]: 2025-10-02 12:04:59.702 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:59 np0005466013 nova_compute[192144]: 2025-10-02 12:04:59.729 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:04:59 np0005466013 nova_compute[192144]: 2025-10-02 12:04:59.960 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:59 np0005466013 nova_compute[192144]: 2025-10-02 12:04:59.961 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:59 np0005466013 nova_compute[192144]: 2025-10-02 12:04:59.969 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:04:59 np0005466013 nova_compute[192144]: 2025-10-02 12:04:59.969 2 INFO nova.compute.claims [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.107 2 DEBUG nova.scheduler.client.report [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.202 2 DEBUG nova.scheduler.client.report [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.203 2 DEBUG nova.compute.provider_tree [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.227 2 DEBUG nova.scheduler.client.report [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.266 2 DEBUG nova.scheduler.client.report [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.336 2 DEBUG nova.compute.provider_tree [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.364 2 DEBUG nova.scheduler.client.report [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.390 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.391 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.462 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.463 2 DEBUG nova.network.neutron [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.494 2 INFO nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.519 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.655 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.656 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.656 2 INFO nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Creating image(s)#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.657 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "/var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.657 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "/var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.658 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "/var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.674 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.731 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.733 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.733 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.747 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.800 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.801 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.885 2 DEBUG nova.policy [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:05:00 np0005466013 nova_compute[192144]: 2025-10-02 12:05:00.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.040 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk 1073741824" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.041 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.042 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.101 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.102 2 DEBUG nova.virt.disk.api [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Checking if we can resize image /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.102 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.158 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.160 2 DEBUG nova.virt.disk.api [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Cannot resize image /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.160 2 DEBUG nova.objects.instance [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'migration_context' on Instance uuid 31fa0ee3-64b4-4f39-adf9-bceb5906e105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.180 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.180 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Ensure instance console log exists: /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.180 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.181 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:01 np0005466013 nova_compute[192144]: 2025-10-02 12:05:01.181 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:02.285 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:02.286 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:02.287 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:02 np0005466013 podman[222844]: 2025-10-02 12:05:02.671684081 +0000 UTC m=+0.050045116 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:05:02 np0005466013 nova_compute[192144]: 2025-10-02 12:05:02.681 2 DEBUG nova.network.neutron [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Successfully created port: be03f2e4-1e42-4870-941c-467fcae525e2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:05:02 np0005466013 nova_compute[192144]: 2025-10-02 12:05:02.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:03 np0005466013 nova_compute[192144]: 2025-10-02 12:05:03.933 2 DEBUG nova.network.neutron [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Successfully updated port: be03f2e4-1e42-4870-941c-467fcae525e2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:05:03 np0005466013 nova_compute[192144]: 2025-10-02 12:05:03.950 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:03 np0005466013 nova_compute[192144]: 2025-10-02 12:05:03.950 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquired lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:03 np0005466013 nova_compute[192144]: 2025-10-02 12:05:03.950 2 DEBUG nova.network.neutron [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:05:04 np0005466013 nova_compute[192144]: 2025-10-02 12:05:04.232 2 DEBUG nova.network.neutron [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:05:05 np0005466013 podman[222865]: 2025-10-02 12:05:05.685680777 +0000 UTC m=+0.059156907 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git)
Oct  2 08:05:05 np0005466013 podman[222864]: 2025-10-02 12:05:05.702948411 +0000 UTC m=+0.071257112 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:05:05 np0005466013 nova_compute[192144]: 2025-10-02 12:05:05.907 2 DEBUG nova.compute.manager [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received event network-changed-be03f2e4-1e42-4870-941c-467fcae525e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:05 np0005466013 nova_compute[192144]: 2025-10-02 12:05:05.907 2 DEBUG nova.compute.manager [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Refreshing instance network info cache due to event network-changed-be03f2e4-1e42-4870-941c-467fcae525e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:05:05 np0005466013 nova_compute[192144]: 2025-10-02 12:05:05.907 2 DEBUG oslo_concurrency.lockutils [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:05 np0005466013 nova_compute[192144]: 2025-10-02 12:05:05.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.324 2 DEBUG nova.network.neutron [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Updating instance_info_cache with network_info: [{"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.351 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Releasing lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.351 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Instance network_info: |[{"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.352 2 DEBUG oslo_concurrency.lockutils [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.352 2 DEBUG nova.network.neutron [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Refreshing network info cache for port be03f2e4-1e42-4870-941c-467fcae525e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.354 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Start _get_guest_xml network_info=[{"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.358 2 WARNING nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.362 2 DEBUG nova.virt.libvirt.host [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.362 2 DEBUG nova.virt.libvirt.host [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.365 2 DEBUG nova.virt.libvirt.host [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.365 2 DEBUG nova.virt.libvirt.host [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.366 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.366 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.367 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.367 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.367 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.368 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.368 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.368 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.369 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.369 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.369 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.369 2 DEBUG nova.virt.hardware [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.373 2 DEBUG nova.virt.libvirt.vif [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1762751702',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1762751702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1762751702',id=29,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-uefui2o0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',owner_user
_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:05:00Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=31fa0ee3-64b4-4f39-adf9-bceb5906e105,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.374 2 DEBUG nova.network.os_vif_util [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.374 2 DEBUG nova.network.os_vif_util [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d2:a3,bridge_name='br-int',has_traffic_filtering=True,id=be03f2e4-1e42-4870-941c-467fcae525e2,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe03f2e4-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.375 2 DEBUG nova.objects.instance [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 31fa0ee3-64b4-4f39-adf9-bceb5906e105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.401 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <uuid>31fa0ee3-64b4-4f39-adf9-bceb5906e105</uuid>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <name>instance-0000001d</name>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1762751702</nova:name>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:05:06</nova:creationTime>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:user uuid="cdc7ec1af4d8410db0b4592293549806">tempest-ImagesOneServerNegativeTestJSON-507683469-project-member</nova:user>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:project uuid="87e7399e976c40bc84f320ed0d052ac6">tempest-ImagesOneServerNegativeTestJSON-507683469</nova:project>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        <nova:port uuid="be03f2e4-1e42-4870-941c-467fcae525e2">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <entry name="serial">31fa0ee3-64b4-4f39-adf9-bceb5906e105</entry>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <entry name="uuid">31fa0ee3-64b4-4f39-adf9-bceb5906e105</entry>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.config"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:e7:d2:a3"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <target dev="tapbe03f2e4-1e"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/console.log" append="off"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:05:06 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:05:06 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:05:06 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:05:06 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.402 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Preparing to wait for external event network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.403 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.403 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.403 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.404 2 DEBUG nova.virt.libvirt.vif [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1762751702',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1762751702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1762751702',id=29,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-uefui2o0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',
owner_user_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:05:00Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=31fa0ee3-64b4-4f39-adf9-bceb5906e105,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.404 2 DEBUG nova.network.os_vif_util [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.405 2 DEBUG nova.network.os_vif_util [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d2:a3,bridge_name='br-int',has_traffic_filtering=True,id=be03f2e4-1e42-4870-941c-467fcae525e2,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe03f2e4-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.405 2 DEBUG os_vif [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d2:a3,bridge_name='br-int',has_traffic_filtering=True,id=be03f2e4-1e42-4870-941c-467fcae525e2,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe03f2e4-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.406 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.406 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe03f2e4-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe03f2e4-1e, col_values=(('external_ids', {'iface-id': 'be03f2e4-1e42-4870-941c-467fcae525e2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:d2:a3', 'vm-uuid': '31fa0ee3-64b4-4f39-adf9-bceb5906e105'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:06 np0005466013 NetworkManager[51205]: <info>  [1759406706.4386] manager: (tapbe03f2e4-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.445 2 INFO os_vif [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d2:a3,bridge_name='br-int',has_traffic_filtering=True,id=be03f2e4-1e42-4870-941c-467fcae525e2,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe03f2e4-1e')#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.518 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.518 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.518 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No VIF found with MAC fa:16:3e:e7:d2:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:05:06 np0005466013 nova_compute[192144]: 2025-10-02 12:05:06.518 2 INFO nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Using config drive#033[00m
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.151 2 INFO nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Creating config drive at /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.config#033[00m
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.155 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpldvvru41 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.278 2 DEBUG oslo_concurrency.processutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpldvvru41" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:07 np0005466013 kernel: tapbe03f2e4-1e: entered promiscuous mode
Oct  2 08:05:07 np0005466013 NetworkManager[51205]: <info>  [1759406707.3477] manager: (tapbe03f2e4-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:07Z|00068|binding|INFO|Claiming lport be03f2e4-1e42-4870-941c-467fcae525e2 for this chassis.
Oct  2 08:05:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:07Z|00069|binding|INFO|be03f2e4-1e42-4870-941c-467fcae525e2: Claiming fa:16:3e:e7:d2:a3 10.100.0.8
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:07 np0005466013 systemd-udevd[222924]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:05:07 np0005466013 NetworkManager[51205]: <info>  [1759406707.3954] device (tapbe03f2e4-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:05:07 np0005466013 NetworkManager[51205]: <info>  [1759406707.3967] device (tapbe03f2e4-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:05:07 np0005466013 systemd-machined[152202]: New machine qemu-11-instance-0000001d.
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:07Z|00070|binding|INFO|Setting lport be03f2e4-1e42-4870-941c-467fcae525e2 ovn-installed in OVS
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:07 np0005466013 systemd[1]: Started Virtual Machine qemu-11-instance-0000001d.
Oct  2 08:05:07 np0005466013 nova_compute[192144]: 2025-10-02 12:05:07.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:08 np0005466013 nova_compute[192144]: 2025-10-02 12:05:08.042 2 DEBUG nova.network.neutron [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Updated VIF entry in instance network info cache for port be03f2e4-1e42-4870-941c-467fcae525e2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:05:08 np0005466013 nova_compute[192144]: 2025-10-02 12:05:08.043 2 DEBUG nova.network.neutron [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Updating instance_info_cache with network_info: [{"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:08 np0005466013 nova_compute[192144]: 2025-10-02 12:05:08.080 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406708.0799563, 31fa0ee3-64b4-4f39-adf9-bceb5906e105 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:08 np0005466013 nova_compute[192144]: 2025-10-02 12:05:08.080 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] VM Started (Lifecycle Event)#033[00m
Oct  2 08:05:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:09Z|00071|binding|INFO|Setting lport be03f2e4-1e42-4870-941c-467fcae525e2 up in Southbound
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.717 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:d2:a3 10.100.0.8'], port_security=['fa:16:3e:e7:d2:a3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-982b406e-0686-44db-8945-39e0f57e4781', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87e7399e976c40bc84f320ed0d052ac6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '410de2f3-62e2-482c-a480-7655c2811e48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f46c1e1f-04ef-471b-85c6-c4415ad3e6bb, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=be03f2e4-1e42-4870-941c-467fcae525e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.718 103323 INFO neutron.agent.ovn.metadata.agent [-] Port be03f2e4-1e42-4870-941c-467fcae525e2 in datapath 982b406e-0686-44db-8945-39e0f57e4781 bound to our chassis#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.720 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 982b406e-0686-44db-8945-39e0f57e4781#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.732 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[00d38bf1-fd60-4b58-a6b1-e9eb9b445781]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.733 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap982b406e-01 in ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.735 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap982b406e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.735 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3d75ba-0e3f-409b-b5ea-f124763c0f56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.736 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[05b9eff8-1dfe-40f2-bc61-dc2bed1045ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.748 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[151b6e75-4557-4257-853b-d3c4f4fac846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.757 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.758 2 DEBUG oslo_concurrency.lockutils [req-5f26bbe5-20de-49b7-aefc-20aa493966b6 req-e53eb8d0-1758-445f-910f-f621f2624270 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.761 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406708.0818295, 31fa0ee3-64b4-4f39-adf9-bceb5906e105 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.762 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.772 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0a90b22b-f144-4420-b6ae-3d094ace6593]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.796 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.799 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[72e4ae35-944f-40d2-88bb-aeee4f626e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.800 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.806 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[25afd4ea-f207-447d-982b-61e4af58bceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 NetworkManager[51205]: <info>  [1759406709.8091] manager: (tap982b406e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.830 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.834 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[57894fb2-cfd8-4b3e-95c1-59236bc8efd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.838 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0e7c20-201c-4b31-b711-fea16cbbfcf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 podman[222946]: 2025-10-02 12:05:09.849935009 +0000 UTC m=+0.069268580 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:05:09 np0005466013 podman[222944]: 2025-10-02 12:05:09.849950699 +0000 UTC m=+0.071481548 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:05:09 np0005466013 NetworkManager[51205]: <info>  [1759406709.8617] device (tap982b406e-00): carrier: link connected
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.865 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b284c814-778f-430f-8a2c-7766f55314d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.879 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[39385cf2-c2ec-4ad4-b05d-d20e70cb196a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap982b406e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:e2:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470147, 'reachable_time': 34125, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223008, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.891 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6c8f73-111d-479f-8126-a57f2021a077]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:e21f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 470147, 'tstamp': 470147}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223010, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.906 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb1ce75-de76-434a-9153-25d06d41260d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap982b406e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:e2:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470147, 'reachable_time': 34125, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223011, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.932 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63683d03-4290-4ea8-9499-a297b675b984]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.987 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[09f1c096-1ae2-4afe-a2ad-4819d54c4c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.988 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap982b406e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.989 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.989 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap982b406e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:09 np0005466013 NetworkManager[51205]: <info>  [1759406709.9921] manager: (tap982b406e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Oct  2 08:05:09 np0005466013 kernel: tap982b406e-00: entered promiscuous mode
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:09.995 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap982b406e-00, col_values=(('external_ids', {'iface-id': 'e7c44940-f7d8-482e-a63d-10c99ba9de76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:09Z|00072|binding|INFO|Releasing lport e7c44940-f7d8-482e-a63d-10c99ba9de76 from this chassis (sb_readonly=0)
Oct  2 08:05:09 np0005466013 nova_compute[192144]: 2025-10-02 12:05:09.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:10.010 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:10.011 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9b3ec0b0-cb0b-4957-9fb2-fea82a198238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:10.011 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-982b406e-0686-44db-8945-39e0f57e4781
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 982b406e-0686-44db-8945-39e0f57e4781
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:10.012 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'env', 'PROCESS_TAG=haproxy-982b406e-0686-44db-8945-39e0f57e4781', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/982b406e-0686-44db-8945-39e0f57e4781.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.307 2 DEBUG nova.compute.manager [req-808f478d-8be9-4c2f-a54d-20af739edade req-259d8196-0722-40fc-a00f-0d71cc4c1d06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received event network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.307 2 DEBUG oslo_concurrency.lockutils [req-808f478d-8be9-4c2f-a54d-20af739edade req-259d8196-0722-40fc-a00f-0d71cc4c1d06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.308 2 DEBUG oslo_concurrency.lockutils [req-808f478d-8be9-4c2f-a54d-20af739edade req-259d8196-0722-40fc-a00f-0d71cc4c1d06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.308 2 DEBUG oslo_concurrency.lockutils [req-808f478d-8be9-4c2f-a54d-20af739edade req-259d8196-0722-40fc-a00f-0d71cc4c1d06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.308 2 DEBUG nova.compute.manager [req-808f478d-8be9-4c2f-a54d-20af739edade req-259d8196-0722-40fc-a00f-0d71cc4c1d06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Processing event network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.309 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.313 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406710.3122566, 31fa0ee3-64b4-4f39-adf9-bceb5906e105 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.313 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.315 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.319 2 INFO nova.virt.libvirt.driver [-] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Instance spawned successfully.#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.319 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.361 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.365 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.365 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.365 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.366 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.366 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.366 2 DEBUG nova.virt.libvirt.driver [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.372 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.412 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:05:10 np0005466013 podman[223044]: 2025-10-02 12:05:10.425453984 +0000 UTC m=+0.055655670 container create 2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:05:10 np0005466013 systemd[1]: Started libpod-conmon-2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b.scope.
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.471 2 INFO nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Took 9.82 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.472 2 DEBUG nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:10 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:05:10 np0005466013 podman[223044]: 2025-10-02 12:05:10.393234599 +0000 UTC m=+0.023436305 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:05:10 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ebf6a82dd366def3dcda5ffcfd584129b661017b4d8dc9ab44ccddc1c2e366d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:05:10 np0005466013 podman[223044]: 2025-10-02 12:05:10.4994958 +0000 UTC m=+0.129697516 container init 2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:10 np0005466013 podman[223044]: 2025-10-02 12:05:10.504484355 +0000 UTC m=+0.134686041 container start 2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:10 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [NOTICE]   (223064) : New worker (223066) forked
Oct  2 08:05:10 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [NOTICE]   (223064) : Loading success.
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:10.527 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:10.566 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.629 2 INFO nova.compute.manager [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Took 10.81 seconds to build instance.#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.651 2 DEBUG oslo_concurrency.lockutils [None req-33533965-b657-4d9b-a79e-8277230bc6be cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:10 np0005466013 nova_compute[192144]: 2025-10-02 12:05:10.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.061 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.061 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.062 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.062 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.204 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.262 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.263 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.316 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.322 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.375 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.375 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.429 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.598 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.599 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5521MB free_disk=73.43436813354492GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.599 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.600 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.892 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 356bc6d6-1101-467e-a020-65876724c955 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.893 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 31fa0ee3-64b4-4f39-adf9-bceb5906e105 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.893 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.893 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.973 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:11 np0005466013 nova_compute[192144]: 2025-10-02 12:05:11.989 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:12 np0005466013 nova_compute[192144]: 2025-10-02 12:05:12.022 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:05:12 np0005466013 nova_compute[192144]: 2025-10-02 12:05:12.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:12 np0005466013 nova_compute[192144]: 2025-10-02 12:05:12.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:13.567 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.023 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.024 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.024 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.314 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.314 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.315 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.315 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 31fa0ee3-64b4-4f39-adf9-bceb5906e105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.397 2 DEBUG nova.compute.manager [req-c1f10c9e-e670-447b-abe7-646884099c79 req-be60dfbe-3d60-45d8-9101-5364cda117bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received event network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.398 2 DEBUG oslo_concurrency.lockutils [req-c1f10c9e-e670-447b-abe7-646884099c79 req-be60dfbe-3d60-45d8-9101-5364cda117bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.398 2 DEBUG oslo_concurrency.lockutils [req-c1f10c9e-e670-447b-abe7-646884099c79 req-be60dfbe-3d60-45d8-9101-5364cda117bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.398 2 DEBUG oslo_concurrency.lockutils [req-c1f10c9e-e670-447b-abe7-646884099c79 req-be60dfbe-3d60-45d8-9101-5364cda117bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.398 2 DEBUG nova.compute.manager [req-c1f10c9e-e670-447b-abe7-646884099c79 req-be60dfbe-3d60-45d8-9101-5364cda117bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] No waiting events found dispatching network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:14 np0005466013 nova_compute[192144]: 2025-10-02 12:05:14.398 2 WARNING nova.compute.manager [req-c1f10c9e-e670-447b-abe7-646884099c79 req-be60dfbe-3d60-45d8-9101-5364cda117bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received unexpected event network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.138 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Updating instance_info_cache with network_info: [{"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.155 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-31fa0ee3-64b4-4f39-adf9-bceb5906e105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.156 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.156 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.156 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.156 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.157 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.157 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.157 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.345 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '356bc6d6-1101-467e-a020-65876724c955', 'name': 'tempest-LiveMigrationTest-server-507794369', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'hostId': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.347 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '87e7399e976c40bc84f320ed0d052ac6', 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'hostId': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.348 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.360 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.360 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.371 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.372 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50119b60-f77c-49ff-abd5-d3fdb51cb0e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.348266', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ef22dc6-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.02649806, 'message_signature': 'c7d12421bb5d50dfd7d223230ab5ab20ec4b4b2e4225a139d25855cf9b8f0e6c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.348266', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ef23c44-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.02649806, 'message_signature': '4933206936bb5cde1f1747060052c5f63aa1c9488412af82b7a4912bb40ef89d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.348266', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ef3eef4-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.039379426, 'message_signature': 'e932d98b8eace7df45706160cbf9aa02afa22891761d0ec6e4c8439c68c23f89'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.348266', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ef3fad4-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.039379426, 'message_signature': 'b62e93154d5cfc23aa1a17100a394dbf3e6af3675ad83a94dac567fcef7592e7'}]}, 'timestamp': '2025-10-02 12:05:16.372563', '_unique_id': '8650ff04a4724d968874cb9993b9e93e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.373 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.374 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.375 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>]
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.375 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.378 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 356bc6d6-1101-467e-a020-65876724c955 / tap29214def-24 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.378 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.incoming.bytes volume: 1846 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.380 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 31fa0ee3-64b4-4f39-adf9-bceb5906e105 / tapbe03f2e4-1e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.380 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6377938-ca22-472f-bd1d-9d95d3651c83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1846, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.375780', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0ef4edae-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': '7436701889ea8394a181e45695f76f606a65eb558edd722b7a038b7b869664ab'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 
'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.375780', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0ef54600-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '9f29c6842649826f6604dd50138ca01b023a3ae652f7d950e04fecedb2554475'}]}, 'timestamp': '2025-10-02 12:05:16.381133', '_unique_id': '108d2135b9444188b3a75704d13e5e86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.382 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.383 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.383 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.383 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd415d2a8-840b-454e-aea7-8ba5483824f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.382969', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ef59b78-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.02649806, 'message_signature': '536f813682a32fbd1eb2a865878d3ff8942e4cf340c5468ae74692b9446abfd7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 
'356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.382969', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ef5a6ea-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.02649806, 'message_signature': '2d60d30e45eccab38b433bfe71597291f71064dc62a834336ada6c322b520f0a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.382969', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ef5b0c2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.039379426, 'message_signature': 'fac52a33bf42b6a6cc14b4201028424ca66aeef34d4aa3dc1188a018d95c70e0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.382969', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ef5be00-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.039379426, 'message_signature': '77033b6cae16c88ac1e67b86db83f842b69584fb7f47e854b69e1391b062648b'}]}, 'timestamp': '2025-10-02 12:05:16.384136', '_unique_id': 'c19e269d15f64804bea055df5b82cc99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.385 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.405 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.406 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.432 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.433 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24604814-d148-4382-a3e3-3ad0c579e93c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.386073', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ef9144c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '06e578535ca476e7d0124e490cb0586e7204eb96b6569c589089eb39789dc9c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 
'356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.386073', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ef9214e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': 'ea1c48466d5e84b6bef271511592c7b2b2f389541ec0ca8c207542790e6ca4f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.386073', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0efd431e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': '63f7dc42aa5f3adb65776847f99bdf29dbfaac86fbb72891a8537b9a2b1da9ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.386073', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0efd4da0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': 'eb740efad8d6d1705cbbaf22989daadf38747bc30a0a1451cc8e19404b67de74'}]}, 'timestamp': '2025-10-02 12:05:16.433660', '_unique_id': 'ccb7cd8fef7041cb98f95670e702b56b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.435 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.435 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.435 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.436 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.436 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a95cb7fe-8f8f-443a-b771-bd3ea5b09c90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.435573', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0efda17e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '3920d3551407109554593befe04671b7d616b2065e6b679d5b6ddf2afb499b30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 
'resource_id': '356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.435573', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0efdaa3e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '8b6de792529ce65b0fd1e4fa66eeda83cf30094a02620532d7218b236ef60e88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.435573', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0efdb54c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': 'e99a15de947afa9a076bd7c1cca0355a16c34ec0e99d3fa0bcb27eddbe619aea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.435573', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0efdbe52-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': 'bfb3736856e070e0ff27412a07fd4ead0348016a6ea95770c7598f8294b24eb9'}]}, 'timestamp': '2025-10-02 12:05:16.436532', '_unique_id': 'e5896f5b9d80450f9553d563780032f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.437 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:05:16 np0005466013 nova_compute[192144]: 2025-10-02 12:05:16.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.455 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/memory.usage volume: 42.50390625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.473 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.474 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 31fa0ee3-64b4-4f39-adf9-bceb5906e105: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c38cbb13-0e69-4d50-bed0-2c359dac0d2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.50390625, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955', 'timestamp': '2025-10-02T12:05:16.437950', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0f00af18-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.133171795, 'message_signature': '955ff018a03abc7c312d8cbd7e7c80f244b0e3371df96958325aab9a80c22776'}]}, 'timestamp': '2025-10-02 12:05:16.474467', '_unique_id': 'dee0095092464047b0d4c1533118b649'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.476 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.477 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.477 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0efc5e7f-100e-4af6-a4e2-b178843a5ac5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.477080', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f03fbb4-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': 'b175fcbb5d5076343cc08504eec00e420e6f02cb18b4e8cbf48e84cf7886905d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.477080', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f040a3c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '860ad9ed4425e5d45d4a37e01e46e3c8f92b26bb82ade7837e1abe4ec33ae2f7'}]}, 'timestamp': '2025-10-02 12:05:16.477863', '_unique_id': '1de7fd20b18f422e9ca29e1c39b83640'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.479 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.479 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.479 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>]
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.480 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.480 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/cpu volume: 280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.480 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/cpu volume: 5950000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27c2b85e-5b89-4495-9f24-cd129cd677ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 280000000, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955', 'timestamp': '2025-10-02T12:05:16.480247', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0f0473f0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.133171795, 'message_signature': '8dbf50ba89a68452d3252a443f50adf0f3b7f1c70766f568aef7b6584d69bb1f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5950000000, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 
'31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'timestamp': '2025-10-02T12:05:16.480247', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0f047dd2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.151862672, 'message_signature': '3d5ad6fe5031640ff747745356e8fe53ece9b83fb592ea1365f4116b311fec29'}]}, 'timestamp': '2025-10-02 12:05:16.480763', '_unique_id': 'e252d68658b74e7a8317b81aefa37fde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.482 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.482 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.482 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04948ee7-53eb-49e6-98f4-ab9e9aec7a18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.482612', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f04d0ca-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': '13fe4942d098759d68518e21265eb959f2c28f8c51e26a06e44a9b7c8b6e596c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.482612', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f04de76-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '217458f2f2ee3cf72bb179ea45c96a7387505914e8c3aaa4380df9096d5ef6b9'}]}, 'timestamp': '2025-10-02 12:05:16.483288', '_unique_id': 'c0c233cc087840a19462c2c8959ae21e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.484 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.485 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.485 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.485 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.485 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4584b4ae-eca7-4399-a3e8-c6528b948c68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.485030', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f052f16-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.02649806, 'message_signature': '12859b5f0d349a8b9253262a923cfaa9e19e36aba4a335ae7a78c3c8e543dc1b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 
'356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.485030', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f05388a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.02649806, 'message_signature': 'a9d787851e9bb866918cafc2d5be17ecca4bc03de6fe6ed739da817717dbca2f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.485030', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f053ff6-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.039379426, 'message_signature': '508a683749873cbf91eb676dd73b548b7636cc5ff3293b089042b46241a93a04'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.485030', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f0547b2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.039379426, 'message_signature': '7ad45b09fac855cdd850220084c1d492bb417e38bf1363b248b0f88cb446c3b5'}]}, 'timestamp': '2025-10-02 12:05:16.485973', '_unique_id': 'dd9b24e6b84a4deca3e71247feb5e24c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.487 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.487 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.487 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.487 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.read.latency volume: 437007352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.488 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.read.latency volume: 2194878 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4676cf47-0152-4a42-a95d-e9715ff201a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.487363', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f0589a2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': 'df10f1f7df22ad860f6f3d17b88cd782df5085adf51f390bc8b3faff5385f164'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 
'356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.487363', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f0593ca-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '0517e53ae55a502dca1825bbcf037b8eec62f6d17b1df061d45a6cc4534f5a2a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 437007352, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.487363', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f059fc8-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': 'a77d439fe63bacc9f7a8164426d9667d53a57182f3f8be221f56bdcd71cbc136'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2194878, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.487363', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f05ab6c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': 'a780bdcab35b69ebbef1d2520c4ca293a2d7bbd34df616024d8b1861b74e794e'}]}, 'timestamp': '2025-10-02 12:05:16.488513', '_unique_id': '9149b97ad34345dc9496c68ad2a04faa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.489 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.490 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.490 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.490 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34641a21-a965-4348-9986-4abf4e18ea3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.490213', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f05f8e2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': '53c380b5502dac138cc299fd2050d7c1aa4585ed38bfc0c00cc6bdc146004127'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.490213', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f060242-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '5f4262254896f88890029faba78fc5420ba4bdf6a176daf22b3db82c9ea0d76e'}]}, 'timestamp': '2025-10-02 12:05:16.490707', '_unique_id': '72613dec342a43b8b51a07adfb57be3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.491 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd2ca245-7417-4e34-aeef-626d525a47d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.491916', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f063a46-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': '8e500972e561dfeb011a8b77475faf46806d4ca94afb73938306432f20f7eb9e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.491916', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f0643a6-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '2f339225a83b43365dc547d5c5e46438e89b0a765b5708c0495eac89764bcc1d'}]}, 'timestamp': '2025-10-02 12:05:16.492383', '_unique_id': '5f0ebc07338440f385d59cedc7c2431f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.493 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.493 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63bf4525-d297-495a-a824-1f866389631f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.493487', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f067786-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': 'a5817861dfcb3662f49e519a980ffb80eff0440114518c412e9bcee94943ac87'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.493487', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f067f92-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '0e7ea3fd0821575039d50dbccfef4dff7a26e098ab5985808e89c53382f0e742'}]}, 'timestamp': '2025-10-02 12:05:16.493940', '_unique_id': '32f0c93acd534cafb799ca1eacffdae2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.494 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.495 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.write.latency volume: 18615761 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.495 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.495 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.495 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87cfa3eb-c8f6-4e19-b8bd-d02f96dc61c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18615761, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.495060', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f06b6e2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '784ac50679bebded00ca4f733b2723aa5fa44335ca01e0975ee4b4dc65c9ffd9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 
'resource_id': '356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.495060', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f06bf98-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '190a799709d832f7a14e51b75d484ad260397fb7ee3169248d2b091cdaad9249'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.495060', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f06c74a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': '5a89ca8bea13ce51b4240427effbb2c7c399c31c66fb70a29daef265b73e7244'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.495060', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f06cf56-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': '15a59f82bfb89a09ddbd315d7f1b4a4a0098ad708a2b59fabf57044155b23c31'}]}, 'timestamp': '2025-10-02 12:05:16.495954', '_unique_id': 'de8a5e77f61642718e07a93595e16a01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.496 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.497 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.497 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.497 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>]
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.497 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.497 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.497 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-507794369>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-1762751702>]
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.498 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.write.requests volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.498 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.498 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.498 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f817b533-8e0e-4183-999f-cffb0474b6a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 19, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.498040', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f072a1e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': 'ca37f03c02059b5fff54ec964af268b8b5b5ab6a23c31401970f28cd4896a195'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 
'resource_id': '356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.498040', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f0733ec-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '8295fcb9c9d71a1eab470704c8b294dda812076ed60d13410186546b185cef05'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.498040', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f073d06-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': '93e2b14d3e3bcb6ab9e15208cc306d225a47e1c63c9388c57cea58edb04e4c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.498040', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f074724-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': '02a1c11e8d35093b8aa3a5dc405080671183051aac9a8f9cf6d089394e901e53'}]}, 'timestamp': '2025-10-02 12:05:16.499034', '_unique_id': 'fedb908edae349298ebc4b41fcea2a7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.501 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.501 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55dea269-274a-48a0-b5d1-0b702f75be41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.500974', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f079f8a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': 'e11e64e80760527c2b2eae36bc18b532101ced4edbd31b7f9f47c2737fcba58e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.500974', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f07abec-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '67dc760d9e64c337f7a32c905efe26f66dd4a6fc0ab8189f67a81a0c1e1aec47'}]}, 'timestamp': '2025-10-02 12:05:16.501621', '_unique_id': '061cf77448f8423992f4b0aff593a1a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.502 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.503 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.503 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.write.bytes volume: 126976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.504 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.504 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.504 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c63413f-1631-42be-83ef-6ce153168ac3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 126976, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': '356bc6d6-1101-467e-a020-65876724c955-vda', 'timestamp': '2025-10-02T12:05:16.503690', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f0809ac-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '25d23cc79287607c4ce47c643b8da6880be3cb146af2ab65ccaaf761a1c15be7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 
'resource_id': '356bc6d6-1101-467e-a020-65876724c955-sda', 'timestamp': '2025-10-02T12:05:16.503690', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'instance-00000014', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f08169a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.064322639, 'message_signature': '56eda06f98f6db77527cdf95133eff7d635303ee15cb938e06099f036619f06e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-vda', 'timestamp': '2025-10-02T12:05:16.503690', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0f081f46-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': '8c18621bc1680ad9f22cca52ae030bd2bc92ea63dd2ff1837d759494de4e28db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105-sda', 'timestamp': '2025-10-02T12:05:16.503690', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'instance-0000001d', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0f0827e8-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.084522457, 'message_signature': '6f8f298cf1531c1a34076e0951c19903dcf9e4c7b501daee846072518b10a288'}]}, 'timestamp': '2025-10-02 12:05:16.504794', '_unique_id': '58b1a49cedb14210b284b10e9007bcb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.505 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.506 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd459d86-e6fb-4cb8-8c7b-d0179f7e595a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.506705', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f087fa4-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': 'fdab85f359494d11aa7147a055a365b7c815bd5a20ba74ba9ca8b956f285cb5e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.506705', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f088ba2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '99665c5ded788a2528ec7b90ead282a43cd5566c9439546ea568cb994df878b4'}]}, 'timestamp': '2025-10-02 12:05:16.507343', '_unique_id': 'c69b0efe58f14a79942415fe4a71d7ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.507 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.508 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.incoming.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.508 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e255412d-3566-4e38-a45a-cd99b740cf0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.508598', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f08c70c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': '9af558d051b58730532e7a9edbad3280bfe28cab52e4c971fcf81794e5c41a82'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.508598', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f08d3dc-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': '7389d03318f6f1d5d8227d57b25dc5b0f59365b49de65a74cc295b43eaf4e1b3'}]}, 'timestamp': '2025-10-02 12:05:16.509207', '_unique_id': 'bb14971045fe4b898c1bb026131faf88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.509 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.510 12 DEBUG ceilometer.compute.pollsters [-] 356bc6d6-1101-467e-a020-65876724c955/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.510 12 DEBUG ceilometer.compute.pollsters [-] 31fa0ee3-64b4-4f39-adf9-bceb5906e105/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7095dfd4-0243-4608-99db-9488d96d3c60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_name': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_name': None, 'resource_id': 'instance-00000014-356bc6d6-1101-467e-a020-65876724c955-tap29214def-24', 'timestamp': '2025-10-02T12:05:16.510455', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-507794369', 'name': 'tap29214def-24', 'instance_id': '356bc6d6-1101-467e-a020-65876724c955', 'instance_type': 'm1.nano', 'host': '04aeeafd3b3d14c37aa026fe541948d818a0d44b2eb5c8767ebc72ff', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1d:3d:20', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29214def-24'}, 'message_id': '0f09105e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.05406834, 'message_signature': 'b1a603a4fedc3d6afa2522545623987c788e82d5975d815a5cbdf887ff810507'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_name': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_name': None, 'resource_id': 'instance-0000001d-31fa0ee3-64b4-4f39-adf9-bceb5906e105-tapbe03f2e4-1e', 'timestamp': '2025-10-02T12:05:16.510455', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-1762751702', 'name': 'tapbe03f2e4-1e', 'instance_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'instance_type': 'm1.nano', 'host': 'e78f6e823697588ea37aa098ba375e8d623624c2700ad6613abe0a2e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:d2:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbe03f2e4-1e'}, 'message_id': '0f091cac-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4708.057050409, 'message_signature': 'b3f7afe452d6a001edaf4895b13041ab323ad24a717c1e4321bbd6b61ec1a3ee'}]}, 'timestamp': '2025-10-02 12:05:16.511074', '_unique_id': 'd6449ba3e3204b14b622f7041c47a9e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:05:16.511 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:05:17 np0005466013 nova_compute[192144]: 2025-10-02 12:05:17.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:17 np0005466013 nova_compute[192144]: 2025-10-02 12:05:17.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:17 np0005466013 nova_compute[192144]: 2025-10-02 12:05:17.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:20 np0005466013 nova_compute[192144]: 2025-10-02 12:05:20.132 2 DEBUG nova.compute.manager [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:20 np0005466013 nova_compute[192144]: 2025-10-02 12:05:20.202 2 INFO nova.compute.manager [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] instance snapshotting#033[00m
Oct  2 08:05:20 np0005466013 nova_compute[192144]: 2025-10-02 12:05:20.797 2 INFO nova.virt.libvirt.driver [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Beginning live snapshot process#033[00m
Oct  2 08:05:21 np0005466013 nova_compute[192144]: 2025-10-02 12:05:21.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:22 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.324 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.430 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json -f qcow2" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.431 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.559 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105/disk --force-share --output=json -f qcow2" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.573 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.637 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.638 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxpdfjdk6/5bfedb7f26614eb9a8610896eef90ba8.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.869 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxpdfjdk6/5bfedb7f26614eb9a8610896eef90ba8.delta 1073741824" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:22 np0005466013 nova_compute[192144]: 2025-10-02 12:05:22.870 2 INFO nova.virt.libvirt.driver [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:05:22 np0005466013 podman[223109]: 2025-10-02 12:05:22.955912982 +0000 UTC m=+0.051500122 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:05:22 np0005466013 podman[223110]: 2025-10-02 12:05:22.980011928 +0000 UTC m=+0.073919273 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:05:22 np0005466013 podman[223111]: 2025-10-02 12:05:22.985227121 +0000 UTC m=+0.076348074 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.223 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.225 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.225 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.225 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.226 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.239 2 INFO nova.compute.manager [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Terminating instance#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.250 2 DEBUG nova.compute.manager [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:05:23 np0005466013 kernel: tap29214def-24 (unregistering): left promiscuous mode
Oct  2 08:05:23 np0005466013 NetworkManager[51205]: <info>  [1759406723.2931] device (tap29214def-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.301 2 DEBUG nova.virt.libvirt.guest [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] COPY block job progress, current cursor: 0 final cursor: 19005440 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:23Z|00073|binding|INFO|Releasing lport 29214def-2450-4edd-acc6-84e165aa1e2c from this chassis (sb_readonly=0)
Oct  2 08:05:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:23Z|00074|binding|INFO|Setting lport 29214def-2450-4edd-acc6-84e165aa1e2c down in Southbound
Oct  2 08:05:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:23Z|00075|binding|INFO|Removing iface tap29214def-24 ovn-installed in OVS
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.310 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:3d:20 10.100.0.14'], port_security=['fa:16:3e:1d:3d:20 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '356bc6d6-1101-467e-a020-65876724c955', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-664b6526-6df1-4024-9bab-37218e6c18bd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'a459d514-aab4-4030-9850-e066abdeaccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddfb51e-1095-4b3d-a2dc-f2557cf13b11, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=29214def-2450-4edd-acc6-84e165aa1e2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.312 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 29214def-2450-4edd-acc6-84e165aa1e2c in datapath 664b6526-6df1-4024-9bab-37218e6c18bd unbound from our chassis#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.315 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 664b6526-6df1-4024-9bab-37218e6c18bd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.318 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3e7ec393-984d-40a0-820f-c9d77fda38fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.319 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd namespace which is not needed anymore#033[00m
Oct  2 08:05:23 np0005466013 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000014.scope: Deactivated successfully.
Oct  2 08:05:23 np0005466013 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000014.scope: Consumed 3.850s CPU time.
Oct  2 08:05:23 np0005466013 systemd-machined[152202]: Machine qemu-10-instance-00000014 terminated.
Oct  2 08:05:23 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222580]: [NOTICE]   (222584) : haproxy version is 2.8.14-c23fe91
Oct  2 08:05:23 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222580]: [NOTICE]   (222584) : path to executable is /usr/sbin/haproxy
Oct  2 08:05:23 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222580]: [WARNING]  (222584) : Exiting Master process...
Oct  2 08:05:23 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222580]: [ALERT]    (222584) : Current worker (222586) exited with code 143 (Terminated)
Oct  2 08:05:23 np0005466013 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222580]: [WARNING]  (222584) : All workers exited. Exiting... (0)
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.500 2 INFO nova.virt.libvirt.driver [-] [instance: 356bc6d6-1101-467e-a020-65876724c955] Instance destroyed successfully.#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.501 2 DEBUG nova.objects.instance [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lazy-loading 'resources' on Instance uuid 356bc6d6-1101-467e-a020-65876724c955 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:23 np0005466013 systemd[1]: libpod-26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5.scope: Deactivated successfully.
Oct  2 08:05:23 np0005466013 podman[223210]: 2025-10-02 12:05:23.511868835 +0000 UTC m=+0.104407432 container died 26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.527 2 DEBUG nova.virt.libvirt.vif [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:03:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-507794369',display_name='tempest-LiveMigrationTest-server-507794369',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-507794369',id=20,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:03:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-hsf0qpxd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:04:22Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=356bc6d6-1101-467e-a020-65876724c955,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.528 2 DEBUG nova.network.os_vif_util [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Converting VIF {"id": "29214def-2450-4edd-acc6-84e165aa1e2c", "address": "fa:16:3e:1d:3d:20", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29214def-24", "ovs_interfaceid": "29214def-2450-4edd-acc6-84e165aa1e2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.529 2 DEBUG nova.network.os_vif_util [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.529 2 DEBUG os_vif [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.532 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29214def-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.539 2 INFO os_vif [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:3d:20,bridge_name='br-int',has_traffic_filtering=True,id=29214def-2450-4edd-acc6-84e165aa1e2c,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29214def-24')#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.539 2 INFO nova.virt.libvirt.driver [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Deleting instance files /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955_del#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.540 2 INFO nova.virt.libvirt.driver [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Deletion of /var/lib/nova/instances/356bc6d6-1101-467e-a020-65876724c955_del complete#033[00m
Oct  2 08:05:23 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5-userdata-shm.mount: Deactivated successfully.
Oct  2 08:05:23 np0005466013 systemd[1]: var-lib-containers-storage-overlay-daf239092209cf7d71a2a3e72bb415957e53edb6ffdec3c8f0db392c51a41d96-merged.mount: Deactivated successfully.
Oct  2 08:05:23 np0005466013 podman[223210]: 2025-10-02 12:05:23.647828297 +0000 UTC m=+0.240366884 container cleanup 26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:05:23 np0005466013 systemd[1]: libpod-conmon-26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5.scope: Deactivated successfully.
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.666 2 INFO nova.compute.manager [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.666 2 DEBUG oslo.service.loopingcall [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.667 2 DEBUG nova.compute.manager [-] [instance: 356bc6d6-1101-467e-a020-65876724c955] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.667 2 DEBUG nova.network.neutron [-] [instance: 356bc6d6-1101-467e-a020-65876724c955] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.749 2 DEBUG nova.compute.manager [req-4e4387b4-c68c-4a32-94ca-d2a8417ae49e req-35e516aa-ccd4-41f5-b440-afc3b5f0960a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.749 2 DEBUG oslo_concurrency.lockutils [req-4e4387b4-c68c-4a32-94ca-d2a8417ae49e req-35e516aa-ccd4-41f5-b440-afc3b5f0960a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.749 2 DEBUG oslo_concurrency.lockutils [req-4e4387b4-c68c-4a32-94ca-d2a8417ae49e req-35e516aa-ccd4-41f5-b440-afc3b5f0960a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.750 2 DEBUG oslo_concurrency.lockutils [req-4e4387b4-c68c-4a32-94ca-d2a8417ae49e req-35e516aa-ccd4-41f5-b440-afc3b5f0960a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.750 2 DEBUG nova.compute.manager [req-4e4387b4-c68c-4a32-94ca-d2a8417ae49e req-35e516aa-ccd4-41f5-b440-afc3b5f0960a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.750 2 DEBUG nova.compute.manager [req-4e4387b4-c68c-4a32-94ca-d2a8417ae49e req-35e516aa-ccd4-41f5-b440-afc3b5f0960a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-unplugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:05:23 np0005466013 podman[223263]: 2025-10-02 12:05:23.764800292 +0000 UTC m=+0.094682280 container remove 26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.770 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab0516d-c204-4e73-ab52-36eba55d4854]: (4, ('Thu Oct  2 12:05:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd (26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5)\n26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5\nThu Oct  2 12:05:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd (26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5)\n26238d37c42498c04997eb0f6abf60dacb48ed1c22e5d2b9a4048c2a8e4e94c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.771 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e6962a8b-555f-40ee-a147-a6fa0cdcf9ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.772 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap664b6526-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:23 np0005466013 kernel: tap664b6526-60: left promiscuous mode
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.778 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[db5618d2-7e1d-4a7e-9a1c-d846c28fb624]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.804 2 DEBUG nova.virt.libvirt.guest [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] COPY block job progress, current cursor: 28704768 final cursor: 28704768 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.806 2 INFO nova.virt.libvirt.driver [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.816 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[09d18444-fb61-4f00-83a0-778a0ebfcec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.817 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c8b817-41fe-4c5e-80bd-863990a84eb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.833 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[972d3a35-51df-4d82-bcb4-11fda296608e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464717, 'reachable_time': 22346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223278, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 systemd[1]: run-netns-ovnmeta\x2d664b6526\x2d6df1\x2d4024\x2d9bab\x2d37218e6c18bd.mount: Deactivated successfully.
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.835 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:05:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:23.835 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[00793cc0-6700-412b-a2e9-208e093610ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.843 2 DEBUG nova.privsep.utils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:05:23 np0005466013 nova_compute[192144]: 2025-10-02 12:05:23.843 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxpdfjdk6/5bfedb7f26614eb9a8610896eef90ba8.delta /var/lib/nova/instances/snapshots/tmpxpdfjdk6/5bfedb7f26614eb9a8610896eef90ba8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:24 np0005466013 nova_compute[192144]: 2025-10-02 12:05:24.318 2 DEBUG oslo_concurrency.processutils [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxpdfjdk6/5bfedb7f26614eb9a8610896eef90ba8.delta /var/lib/nova/instances/snapshots/tmpxpdfjdk6/5bfedb7f26614eb9a8610896eef90ba8" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:24 np0005466013 nova_compute[192144]: 2025-10-02 12:05:24.325 2 INFO nova.virt.libvirt.driver [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:05:24 np0005466013 nova_compute[192144]: 2025-10-02 12:05:24.776 2 WARNING nova.compute.manager [None req-8c20fd64-22cd-4331-9e34-d30992f18b94 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Image not found during snapshot: nova.exception.ImageNotFound: Image 70e28f1e-6466-4d85-8b81-01ef8a78e2a9 could not be found.#033[00m
Oct  2 08:05:24 np0005466013 nova_compute[192144]: 2025-10-02 12:05:24.782 2 DEBUG nova.network.neutron [-] [instance: 356bc6d6-1101-467e-a020-65876724c955] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:24 np0005466013 nova_compute[192144]: 2025-10-02 12:05:24.801 2 INFO nova.compute.manager [-] [instance: 356bc6d6-1101-467e-a020-65876724c955] Took 1.13 seconds to deallocate network for instance.#033[00m
Oct  2 08:05:24 np0005466013 nova_compute[192144]: 2025-10-02 12:05:24.916 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:24 np0005466013 nova_compute[192144]: 2025-10-02 12:05:24.916 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.018 2 DEBUG nova.compute.provider_tree [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.046 2 DEBUG nova.scheduler.client.report [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.073 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.113 2 INFO nova.scheduler.client.report [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Deleted allocations for instance 356bc6d6-1101-467e-a020-65876724c955#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.199 2 DEBUG oslo_concurrency.lockutils [None req-1c007e7a-4b15-45cf-8ca9-d4f82a969d69 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:25Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:d2:a3 10.100.0.8
Oct  2 08:05:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:25Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:d2:a3 10.100.0.8
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.941 2 DEBUG nova.compute.manager [req-98ab0150-62e6-4250-962d-c1be00aeeae8 req-ad732ce1-0ab9-4c6b-8a95-f9eb0096a5f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.942 2 DEBUG oslo_concurrency.lockutils [req-98ab0150-62e6-4250-962d-c1be00aeeae8 req-ad732ce1-0ab9-4c6b-8a95-f9eb0096a5f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "356bc6d6-1101-467e-a020-65876724c955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.942 2 DEBUG oslo_concurrency.lockutils [req-98ab0150-62e6-4250-962d-c1be00aeeae8 req-ad732ce1-0ab9-4c6b-8a95-f9eb0096a5f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.942 2 DEBUG oslo_concurrency.lockutils [req-98ab0150-62e6-4250-962d-c1be00aeeae8 req-ad732ce1-0ab9-4c6b-8a95-f9eb0096a5f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "356bc6d6-1101-467e-a020-65876724c955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.942 2 DEBUG nova.compute.manager [req-98ab0150-62e6-4250-962d-c1be00aeeae8 req-ad732ce1-0ab9-4c6b-8a95-f9eb0096a5f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] No waiting events found dispatching network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.943 2 WARNING nova.compute.manager [req-98ab0150-62e6-4250-962d-c1be00aeeae8 req-ad732ce1-0ab9-4c6b-8a95-f9eb0096a5f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received unexpected event network-vif-plugged-29214def-2450-4edd-acc6-84e165aa1e2c for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:05:25 np0005466013 nova_compute[192144]: 2025-10-02 12:05:25.943 2 DEBUG nova.compute.manager [req-98ab0150-62e6-4250-962d-c1be00aeeae8 req-ad732ce1-0ab9-4c6b-8a95-f9eb0096a5f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 356bc6d6-1101-467e-a020-65876724c955] Received event network-vif-deleted-29214def-2450-4edd-acc6-84e165aa1e2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.821 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.821 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.821 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.823 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.823 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.835 2 INFO nova.compute.manager [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Terminating instance#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.847 2 DEBUG nova.compute.manager [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:05:26 np0005466013 kernel: tapbe03f2e4-1e (unregistering): left promiscuous mode
Oct  2 08:05:26 np0005466013 NetworkManager[51205]: <info>  [1759406726.8704] device (tapbe03f2e4-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:05:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:26Z|00076|binding|INFO|Releasing lport be03f2e4-1e42-4870-941c-467fcae525e2 from this chassis (sb_readonly=0)
Oct  2 08:05:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:26Z|00077|binding|INFO|Setting lport be03f2e4-1e42-4870-941c-467fcae525e2 down in Southbound
Oct  2 08:05:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:05:26Z|00078|binding|INFO|Removing iface tapbe03f2e4-1e ovn-installed in OVS
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:26.892 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:d2:a3 10.100.0.8'], port_security=['fa:16:3e:e7:d2:a3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '31fa0ee3-64b4-4f39-adf9-bceb5906e105', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-982b406e-0686-44db-8945-39e0f57e4781', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87e7399e976c40bc84f320ed0d052ac6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '410de2f3-62e2-482c-a480-7655c2811e48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f46c1e1f-04ef-471b-85c6-c4415ad3e6bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=be03f2e4-1e42-4870-941c-467fcae525e2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:26.894 103323 INFO neutron.agent.ovn.metadata.agent [-] Port be03f2e4-1e42-4870-941c-467fcae525e2 in datapath 982b406e-0686-44db-8945-39e0f57e4781 unbound from our chassis#033[00m
Oct  2 08:05:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:26.895 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 982b406e-0686-44db-8945-39e0f57e4781, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:05:26 np0005466013 nova_compute[192144]: 2025-10-02 12:05:26.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:26.896 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0eaf7daf-802f-463e-a4a0-2255f244fa74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:26.897 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 namespace which is not needed anymore#033[00m
Oct  2 08:05:26 np0005466013 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Oct  2 08:05:26 np0005466013 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001d.scope: Consumed 12.816s CPU time.
Oct  2 08:05:26 np0005466013 systemd-machined[152202]: Machine qemu-11-instance-0000001d terminated.
Oct  2 08:05:27 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [NOTICE]   (223064) : haproxy version is 2.8.14-c23fe91
Oct  2 08:05:27 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [NOTICE]   (223064) : path to executable is /usr/sbin/haproxy
Oct  2 08:05:27 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [WARNING]  (223064) : Exiting Master process...
Oct  2 08:05:27 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [WARNING]  (223064) : Exiting Master process...
Oct  2 08:05:27 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [ALERT]    (223064) : Current worker (223066) exited with code 143 (Terminated)
Oct  2 08:05:27 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223060]: [WARNING]  (223064) : All workers exited. Exiting... (0)
Oct  2 08:05:27 np0005466013 systemd[1]: libpod-2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b.scope: Deactivated successfully.
Oct  2 08:05:27 np0005466013 podman[223312]: 2025-10-02 12:05:27.081161914 +0000 UTC m=+0.086431568 container died 2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.116 2 INFO nova.virt.libvirt.driver [-] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Instance destroyed successfully.#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.117 2 DEBUG nova.objects.instance [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'resources' on Instance uuid 31fa0ee3-64b4-4f39-adf9-bceb5906e105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:27 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:05:27 np0005466013 systemd[1]: var-lib-containers-storage-overlay-5ebf6a82dd366def3dcda5ffcfd584129b661017b4d8dc9ab44ccddc1c2e366d-merged.mount: Deactivated successfully.
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.135 2 DEBUG nova.virt.libvirt.vif [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:04:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1762751702',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1762751702',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1762751702',id=29,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:05:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-uefui2o0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:05:24Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=31fa0ee3-64b4-4f39-adf9-bceb5906e105,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.135 2 DEBUG nova.network.os_vif_util [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "be03f2e4-1e42-4870-941c-467fcae525e2", "address": "fa:16:3e:e7:d2:a3", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe03f2e4-1e", "ovs_interfaceid": "be03f2e4-1e42-4870-941c-467fcae525e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.137 2 DEBUG nova.network.os_vif_util [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:d2:a3,bridge_name='br-int',has_traffic_filtering=True,id=be03f2e4-1e42-4870-941c-467fcae525e2,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe03f2e4-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.137 2 DEBUG os_vif [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:d2:a3,bridge_name='br-int',has_traffic_filtering=True,id=be03f2e4-1e42-4870-941c-467fcae525e2,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe03f2e4-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.140 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe03f2e4-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.145 2 INFO os_vif [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:d2:a3,bridge_name='br-int',has_traffic_filtering=True,id=be03f2e4-1e42-4870-941c-467fcae525e2,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe03f2e4-1e')#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.146 2 INFO nova.virt.libvirt.driver [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Deleting instance files /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105_del#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.147 2 INFO nova.virt.libvirt.driver [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Deletion of /var/lib/nova/instances/31fa0ee3-64b4-4f39-adf9-bceb5906e105_del complete#033[00m
Oct  2 08:05:27 np0005466013 podman[223312]: 2025-10-02 12:05:27.154012591 +0000 UTC m=+0.159282245 container cleanup 2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:05:27 np0005466013 systemd[1]: libpod-conmon-2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b.scope: Deactivated successfully.
Oct  2 08:05:27 np0005466013 podman[223357]: 2025-10-02 12:05:27.236070263 +0000 UTC m=+0.059948912 container remove 2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.240 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc07b09-2491-4c33-b88e-3fd41e979612]: (4, ('Thu Oct  2 12:05:26 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 (2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b)\n2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b\nThu Oct  2 12:05:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 (2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b)\n2e79780bf9be020d31c462d8971c2fe9e65db1ba090908cefd26a7706d051c0b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.242 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[05d510fa-6df2-4087-af17-1ee9c6800c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.242 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap982b406e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:27 np0005466013 kernel: tap982b406e-00: left promiscuous mode
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.258 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[48030a75-63a0-4e4f-8486-c20b8de297d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.277 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[61a0da0f-fa98-4ecf-8ef8-55960fd7f1bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.278 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfa7c1f-6597-49ab-a96f-dc378a28502b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.292 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[519f5815-a9eb-4951-8fa8-ff263be1d322]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470141, 'reachable_time': 15638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223372, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.294 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:05:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:05:27.294 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3e8ce0-1b74-46ab-a401-41b7f6cfffab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:27 np0005466013 systemd[1]: run-netns-ovnmeta\x2d982b406e\x2d0686\x2d44db\x2d8945\x2d39e0f57e4781.mount: Deactivated successfully.
Oct  2 08:05:27 np0005466013 nova_compute[192144]: 2025-10-02 12:05:27.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.344 2 DEBUG nova.compute.manager [req-f98050de-d912-4566-ac8e-0e96f590ad3a req-eebf7035-d37f-462f-8a92-3458ca29b16e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received event network-vif-unplugged-be03f2e4-1e42-4870-941c-467fcae525e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.345 2 DEBUG oslo_concurrency.lockutils [req-f98050de-d912-4566-ac8e-0e96f590ad3a req-eebf7035-d37f-462f-8a92-3458ca29b16e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.345 2 DEBUG oslo_concurrency.lockutils [req-f98050de-d912-4566-ac8e-0e96f590ad3a req-eebf7035-d37f-462f-8a92-3458ca29b16e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.346 2 DEBUG oslo_concurrency.lockutils [req-f98050de-d912-4566-ac8e-0e96f590ad3a req-eebf7035-d37f-462f-8a92-3458ca29b16e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.346 2 DEBUG nova.compute.manager [req-f98050de-d912-4566-ac8e-0e96f590ad3a req-eebf7035-d37f-462f-8a92-3458ca29b16e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] No waiting events found dispatching network-vif-unplugged-be03f2e4-1e42-4870-941c-467fcae525e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.346 2 DEBUG nova.compute.manager [req-f98050de-d912-4566-ac8e-0e96f590ad3a req-eebf7035-d37f-462f-8a92-3458ca29b16e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received event network-vif-unplugged-be03f2e4-1e42-4870-941c-467fcae525e2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.438 2 INFO nova.compute.manager [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Took 2.59 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.439 2 DEBUG oslo.service.loopingcall [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.439 2 DEBUG nova.compute.manager [-] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:05:29 np0005466013 nova_compute[192144]: 2025-10-02 12:05:29.439 2 DEBUG nova.network.neutron [-] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.384 2 DEBUG nova.network.neutron [-] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.434 2 DEBUG nova.compute.manager [req-592ffef0-ff00-48ff-a458-3cdc8fe5028a req-391651b5-69a8-4ce8-8191-fcb69cd26245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received event network-vif-deleted-be03f2e4-1e42-4870-941c-467fcae525e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.434 2 INFO nova.compute.manager [req-592ffef0-ff00-48ff-a458-3cdc8fe5028a req-391651b5-69a8-4ce8-8191-fcb69cd26245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Neutron deleted interface be03f2e4-1e42-4870-941c-467fcae525e2; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.434 2 DEBUG nova.network.neutron [req-592ffef0-ff00-48ff-a458-3cdc8fe5028a req-391651b5-69a8-4ce8-8191-fcb69cd26245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.453 2 INFO nova.compute.manager [-] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Took 1.01 seconds to deallocate network for instance.#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.471 2 DEBUG nova.compute.manager [req-592ffef0-ff00-48ff-a458-3cdc8fe5028a req-391651b5-69a8-4ce8-8191-fcb69cd26245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Detach interface failed, port_id=be03f2e4-1e42-4870-941c-467fcae525e2, reason: Instance 31fa0ee3-64b4-4f39-adf9-bceb5906e105 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.737 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.738 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:30 np0005466013 nova_compute[192144]: 2025-10-02 12:05:30.995 2 DEBUG nova.compute.provider_tree [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.011 2 DEBUG nova.scheduler.client.report [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.064 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.115 2 INFO nova.scheduler.client.report [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Deleted allocations for instance 31fa0ee3-64b4-4f39-adf9-bceb5906e105#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.310 2 DEBUG oslo_concurrency.lockutils [None req-978217b1-10dc-4b9b-ab51-056fdd2157ca cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.426 2 DEBUG nova.compute.manager [req-13a0bf82-036d-439d-92d6-e3820d6af22e req-441aa88f-9be6-4859-aa23-5c4669f5b4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received event network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.426 2 DEBUG oslo_concurrency.lockutils [req-13a0bf82-036d-439d-92d6-e3820d6af22e req-441aa88f-9be6-4859-aa23-5c4669f5b4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.426 2 DEBUG oslo_concurrency.lockutils [req-13a0bf82-036d-439d-92d6-e3820d6af22e req-441aa88f-9be6-4859-aa23-5c4669f5b4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.427 2 DEBUG oslo_concurrency.lockutils [req-13a0bf82-036d-439d-92d6-e3820d6af22e req-441aa88f-9be6-4859-aa23-5c4669f5b4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "31fa0ee3-64b4-4f39-adf9-bceb5906e105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.427 2 DEBUG nova.compute.manager [req-13a0bf82-036d-439d-92d6-e3820d6af22e req-441aa88f-9be6-4859-aa23-5c4669f5b4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] No waiting events found dispatching network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:31 np0005466013 nova_compute[192144]: 2025-10-02 12:05:31.427 2 WARNING nova.compute.manager [req-13a0bf82-036d-439d-92d6-e3820d6af22e req-441aa88f-9be6-4859-aa23-5c4669f5b4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Received unexpected event network-vif-plugged-be03f2e4-1e42-4870-941c-467fcae525e2 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:05:32 np0005466013 nova_compute[192144]: 2025-10-02 12:05:32.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:32 np0005466013 nova_compute[192144]: 2025-10-02 12:05:32.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:32 np0005466013 nova_compute[192144]: 2025-10-02 12:05:32.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:33 np0005466013 podman[223373]: 2025-10-02 12:05:33.679888722 +0000 UTC m=+0.059658482 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:05:36 np0005466013 podman[223394]: 2025-10-02 12:05:36.684358678 +0000 UTC m=+0.056856701 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal)
Oct  2 08:05:36 np0005466013 podman[223393]: 2025-10-02 12:05:36.705712294 +0000 UTC m=+0.082738386 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:05:37 np0005466013 nova_compute[192144]: 2025-10-02 12:05:37.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:37 np0005466013 nova_compute[192144]: 2025-10-02 12:05:37.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:38 np0005466013 nova_compute[192144]: 2025-10-02 12:05:38.499 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406723.4970808, 356bc6d6-1101-467e-a020-65876724c955 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:38 np0005466013 nova_compute[192144]: 2025-10-02 12:05:38.499 2 INFO nova.compute.manager [-] [instance: 356bc6d6-1101-467e-a020-65876724c955] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:05:38 np0005466013 nova_compute[192144]: 2025-10-02 12:05:38.526 2 DEBUG nova.compute.manager [None req-5f92a3c8-90b8-4f05-9b73-9c19688ebccc - - - - - -] [instance: 356bc6d6-1101-467e-a020-65876724c955] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:40 np0005466013 podman[223435]: 2025-10-02 12:05:40.678466226 +0000 UTC m=+0.053552301 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:05:40 np0005466013 podman[223436]: 2025-10-02 12:05:40.683885615 +0000 UTC m=+0.056367214 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:42 np0005466013 nova_compute[192144]: 2025-10-02 12:05:42.115 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406727.114241, 31fa0ee3-64b4-4f39-adf9-bceb5906e105 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:42 np0005466013 nova_compute[192144]: 2025-10-02 12:05:42.116 2 INFO nova.compute.manager [-] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:05:42 np0005466013 nova_compute[192144]: 2025-10-02 12:05:42.148 2 DEBUG nova.compute.manager [None req-f3059e0b-5779-4766-9196-95447ebd1cf5 - - - - - -] [instance: 31fa0ee3-64b4-4f39-adf9-bceb5906e105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:42 np0005466013 nova_compute[192144]: 2025-10-02 12:05:42.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:42 np0005466013 nova_compute[192144]: 2025-10-02 12:05:42.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:47 np0005466013 nova_compute[192144]: 2025-10-02 12:05:47.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:47 np0005466013 nova_compute[192144]: 2025-10-02 12:05:47.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:52 np0005466013 nova_compute[192144]: 2025-10-02 12:05:52.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:52 np0005466013 nova_compute[192144]: 2025-10-02 12:05:52.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:53 np0005466013 podman[223482]: 2025-10-02 12:05:53.669680908 +0000 UTC m=+0.046798327 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:05:53 np0005466013 podman[223481]: 2025-10-02 12:05:53.673113751 +0000 UTC m=+0.052540517 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:05:53 np0005466013 podman[223483]: 2025-10-02 12:05:53.710057802 +0000 UTC m=+0.083498750 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:05:57 np0005466013 nova_compute[192144]: 2025-10-02 12:05:57.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:57 np0005466013 nova_compute[192144]: 2025-10-02 12:05:57.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.257 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.257 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.288 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.442 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.443 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.452 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.452 2 INFO nova.compute.claims [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.725 2 DEBUG nova.compute.provider_tree [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.750 2 DEBUG nova.scheduler.client.report [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.776 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.777 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.847 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.848 2 DEBUG nova.network.neutron [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.902 2 INFO nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:06:01 np0005466013 nova_compute[192144]: 2025-10-02 12:06:01.929 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.232 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.233 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.233 2 INFO nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Creating image(s)#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.234 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "/var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.234 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "/var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.235 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "/var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.246 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:02.286 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:02.286 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:02.287 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.301 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.302 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.303 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.318 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.386 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.387 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.423 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.424 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.424 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.483 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.484 2 DEBUG nova.virt.disk.api [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Checking if we can resize image /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.484 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.542 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.543 2 DEBUG nova.virt.disk.api [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Cannot resize image /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.544 2 DEBUG nova.objects.instance [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lazy-loading 'migration_context' on Instance uuid a58a927e-98bd-4fb4-aeaa-7f7817401dc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.574 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.574 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Ensure instance console log exists: /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.575 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.575 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.576 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.961 2 DEBUG nova.network.neutron [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.962 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.963 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.969 2 WARNING nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.983 2 DEBUG nova.virt.libvirt.host [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.984 2 DEBUG nova.virt.libvirt.host [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.989 2 DEBUG nova.virt.libvirt.host [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.990 2 DEBUG nova.virt.libvirt.host [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.991 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.992 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.992 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.992 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.993 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.993 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.993 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.993 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.994 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.994 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.994 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:06:02 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.995 2 DEBUG nova.virt.hardware [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:02.999 2 DEBUG nova.objects.instance [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lazy-loading 'pci_devices' on Instance uuid a58a927e-98bd-4fb4-aeaa-7f7817401dc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:03.016 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <uuid>a58a927e-98bd-4fb4-aeaa-7f7817401dc0</uuid>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <name>instance-00000020</name>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <nova:name>tempest-LiveMigrationNegativeTest-server-1120166688</nova:name>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:06:02</nova:creationTime>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:        <nova:user uuid="c8270c33bced4c1b806e47efa970c01e">tempest-LiveMigrationNegativeTest-50793384-project-member</nova:user>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:        <nova:project uuid="fb361251cae94d42aaec252513e2f05c">tempest-LiveMigrationNegativeTest-50793384</nova:project>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <entry name="serial">a58a927e-98bd-4fb4-aeaa-7f7817401dc0</entry>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <entry name="uuid">a58a927e-98bd-4fb4-aeaa-7f7817401dc0</entry>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk.config"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/console.log" append="off"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:06:03 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:06:03 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:06:03 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:06:03 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:03.109 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:03.109 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:03.110 2 INFO nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Using config drive
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:03.345 2 INFO nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Creating config drive at /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk.config
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:03.350 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp76a3e25g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:06:03 np0005466013 nova_compute[192144]: 2025-10-02 12:06:03.473 2 DEBUG oslo_concurrency.processutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp76a3e25g" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:06:03 np0005466013 systemd-machined[152202]: New machine qemu-12-instance-00000020.
Oct  2 08:06:03 np0005466013 systemd[1]: Started Virtual Machine qemu-12-instance-00000020.
Oct  2 08:06:03 np0005466013 podman[223584]: 2025-10-02 12:06:03.821142558 +0000 UTC m=+0.078230376 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.208 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406764.2084916, a58a927e-98bd-4fb4-aeaa-7f7817401dc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.209 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] VM Resumed (Lifecycle Event)
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.212 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.212 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.215 2 INFO nova.virt.libvirt.driver [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Instance spawned successfully.
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.215 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.244 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.251 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.253 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.254 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.254 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.254 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.255 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.255 2 DEBUG nova.virt.libvirt.driver [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.283 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.284 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406764.2100878, a58a927e-98bd-4fb4-aeaa-7f7817401dc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:06:04 np0005466013 nova_compute[192144]: 2025-10-02 12:06:04.284 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] VM Started (Lifecycle Event)
Oct  2 08:06:05 np0005466013 nova_compute[192144]: 2025-10-02 12:06:05.516 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:06:05 np0005466013 nova_compute[192144]: 2025-10-02 12:06:05.519 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:06:05 np0005466013 nova_compute[192144]: 2025-10-02 12:06:05.541 2 INFO nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Took 3.31 seconds to spawn the instance on the hypervisor.
Oct  2 08:06:05 np0005466013 nova_compute[192144]: 2025-10-02 12:06:05.542 2 DEBUG nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:06:05 np0005466013 nova_compute[192144]: 2025-10-02 12:06:05.585 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:06:05 np0005466013 nova_compute[192144]: 2025-10-02 12:06:05.699 2 INFO nova.compute.manager [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Took 4.30 seconds to build instance.
Oct  2 08:06:05 np0005466013 nova_compute[192144]: 2025-10-02 12:06:05.778 2 DEBUG oslo_concurrency.lockutils [None req-0855ba76-5ba0-4ce5-bed5-a11714e5b50f c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:06:06 np0005466013 nova_compute[192144]: 2025-10-02 12:06:06.394 2 DEBUG nova.objects.instance [None req-4ced9eca-613d-4501-91db-7c0428f5829e 01b136eff3d4449dab3ce9477a904468 62b4fc9ab8b44c5bb854a0593f5b6abe - - default default] Lazy-loading 'pci_devices' on Instance uuid a58a927e-98bd-4fb4-aeaa-7f7817401dc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:06:06 np0005466013 nova_compute[192144]: 2025-10-02 12:06:06.424 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406766.4241061, a58a927e-98bd-4fb4-aeaa-7f7817401dc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:06:06 np0005466013 nova_compute[192144]: 2025-10-02 12:06:06.424 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] VM Paused (Lifecycle Event)
Oct  2 08:06:06 np0005466013 nova_compute[192144]: 2025-10-02 12:06:06.454 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:06:06 np0005466013 nova_compute[192144]: 2025-10-02 12:06:06.458 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:06:06 np0005466013 nova_compute[192144]: 2025-10-02 12:06:06.494 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] During sync_power_state the instance has a pending task (suspending). Skip.
Oct  2 08:06:06 np0005466013 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000020.scope: Deactivated successfully.
Oct  2 08:06:06 np0005466013 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000020.scope: Consumed 2.909s CPU time.
Oct  2 08:06:06 np0005466013 systemd-machined[152202]: Machine qemu-12-instance-00000020 terminated.
Oct  2 08:06:06 np0005466013 podman[223611]: 2025-10-02 12:06:06.904096516 +0000 UTC m=+0.057755949 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9)
Oct  2 08:06:06 np0005466013 podman[223610]: 2025-10-02 12:06:06.911769909 +0000 UTC m=+0.068314058 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:06:07 np0005466013 nova_compute[192144]: 2025-10-02 12:06:07.043 2 DEBUG nova.compute.manager [None req-4ced9eca-613d-4501-91db-7c0428f5829e 01b136eff3d4449dab3ce9477a904468 62b4fc9ab8b44c5bb854a0593f5b6abe - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:07 np0005466013 nova_compute[192144]: 2025-10-02 12:06:07.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:07 np0005466013 nova_compute[192144]: 2025-10-02 12:06:07.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:11 np0005466013 podman[223662]: 2025-10-02 12:06:11.684673233 +0000 UTC m=+0.062460456 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:06:11 np0005466013 podman[223661]: 2025-10-02 12:06:11.704781158 +0000 UTC m=+0.084336069 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:06:12 np0005466013 nova_compute[192144]: 2025-10-02 12:06:12.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:12.371 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:12 np0005466013 nova_compute[192144]: 2025-10-02 12:06:12.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:12.372 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:06:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:12.756 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:12 np0005466013 nova_compute[192144]: 2025-10-02 12:06:12.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:12.757 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:06:12 np0005466013 nova_compute[192144]: 2025-10-02 12:06:12.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:12 np0005466013 nova_compute[192144]: 2025-10-02 12:06:12.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:12 np0005466013 nova_compute[192144]: 2025-10-02 12:06:12.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.026 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.027 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.027 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.030 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.198 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.199 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.199 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.199 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.300 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.352 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.353 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.407 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.554 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.555 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5696MB free_disk=73.4006233215332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.555 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.555 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.683 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance a58a927e-98bd-4fb4-aeaa-7f7817401dc0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.684 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.684 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.721 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.745 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:13.759 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.777 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:06:13 np0005466013 nova_compute[192144]: 2025-10-02 12:06:13.778 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:14Z|00079|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.515 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.516 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.516 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.517 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.517 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.533 2 INFO nova.compute.manager [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Terminating instance#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.549 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "refresh_cache-a58a927e-98bd-4fb4-aeaa-7f7817401dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.549 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquired lock "refresh_cache-a58a927e-98bd-4fb4-aeaa-7f7817401dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.549 2 DEBUG nova.network.neutron [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:14 np0005466013 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.744 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.760 2 DEBUG nova.network.neutron [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:14 np0005466013 nova_compute[192144]: 2025-10-02 12:06:14.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.013 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.014 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.217 2 DEBUG nova.network.neutron [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.236 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Releasing lock "refresh_cache-a58a927e-98bd-4fb4-aeaa-7f7817401dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.237 2 DEBUG nova.compute.manager [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.245 2 INFO nova.virt.libvirt.driver [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Instance destroyed successfully.#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.245 2 DEBUG nova.objects.instance [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lazy-loading 'resources' on Instance uuid a58a927e-98bd-4fb4-aeaa-7f7817401dc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.259 2 INFO nova.virt.libvirt.driver [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Deleting instance files /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0_del#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.260 2 INFO nova.virt.libvirt.driver [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Deletion of /var/lib/nova/instances/a58a927e-98bd-4fb4-aeaa-7f7817401dc0_del complete#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.373 2 INFO nova.compute.manager [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Took 0.14 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.374 2 DEBUG oslo.service.loopingcall [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.374 2 DEBUG nova.compute.manager [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:06:15 np0005466013 nova_compute[192144]: 2025-10-02 12:06:15.374 2 DEBUG nova.network.neutron [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.411 2 DEBUG nova.network.neutron [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.431 2 DEBUG nova.network.neutron [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.449 2 INFO nova.compute.manager [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Took 1.08 seconds to deallocate network for instance.#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.601 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.601 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.666 2 DEBUG nova.compute.provider_tree [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.696 2 DEBUG nova.scheduler.client.report [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.723 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.782 2 INFO nova.scheduler.client.report [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Deleted allocations for instance a58a927e-98bd-4fb4-aeaa-7f7817401dc0#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.915 2 DEBUG oslo_concurrency.lockutils [None req-c09d4c0e-ed6a-4e63-bd29-b9e6e64e945d c8270c33bced4c1b806e47efa970c01e fb361251cae94d42aaec252513e2f05c - - default default] Lock "a58a927e-98bd-4fb4-aeaa-7f7817401dc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:16 np0005466013 nova_compute[192144]: 2025-10-02 12:06:16.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:17 np0005466013 nova_compute[192144]: 2025-10-02 12:06:17.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:17 np0005466013 nova_compute[192144]: 2025-10-02 12:06:17.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:17 np0005466013 nova_compute[192144]: 2025-10-02 12:06:17.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:18 np0005466013 nova_compute[192144]: 2025-10-02 12:06:18.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.057 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.058 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.087 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.250 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.251 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.256 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.256 2 INFO nova.compute.claims [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.605 2 DEBUG nova.compute.provider_tree [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.630 2 DEBUG nova.scheduler.client.report [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.664 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.665 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.742 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.742 2 DEBUG nova.network.neutron [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.762 2 INFO nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.798 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.975 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.976 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.976 2 INFO nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Creating image(s)#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.977 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "/var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.977 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "/var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.978 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "/var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:20 np0005466013 nova_compute[192144]: 2025-10-02 12:06:20.991 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.046 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.047 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.048 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.058 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.075 2 DEBUG nova.policy [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.116 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.117 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.155 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.157 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.157 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.219 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.220 2 DEBUG nova.virt.disk.api [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Checking if we can resize image /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.220 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.277 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.278 2 DEBUG nova.virt.disk.api [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Cannot resize image /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.279 2 DEBUG nova.objects.instance [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'migration_context' on Instance uuid 3a407f85-32f9-4831-9b8c-4d237153d9f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.294 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.295 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Ensure instance console log exists: /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.295 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.296 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:21 np0005466013 nova_compute[192144]: 2025-10-02 12:06:21.296 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:21.375 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:22 np0005466013 nova_compute[192144]: 2025-10-02 12:06:22.044 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406767.042742, a58a927e-98bd-4fb4-aeaa-7f7817401dc0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:22 np0005466013 nova_compute[192144]: 2025-10-02 12:06:22.044 2 INFO nova.compute.manager [-] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:06:22 np0005466013 nova_compute[192144]: 2025-10-02 12:06:22.069 2 DEBUG nova.compute.manager [None req-7ff6d32e-33cc-4247-a1b2-aed5e36aa961 - - - - - -] [instance: a58a927e-98bd-4fb4-aeaa-7f7817401dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:22 np0005466013 nova_compute[192144]: 2025-10-02 12:06:22.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:22 np0005466013 nova_compute[192144]: 2025-10-02 12:06:22.447 2 DEBUG nova.network.neutron [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Successfully created port: 9259d441-74e4-4515-8b47-d86ae8e47f98 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:06:22 np0005466013 nova_compute[192144]: 2025-10-02 12:06:22.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.234 2 DEBUG nova.network.neutron [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Successfully updated port: 9259d441-74e4-4515-8b47-d86ae8e47f98 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.259 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "refresh_cache-3a407f85-32f9-4831-9b8c-4d237153d9f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.260 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquired lock "refresh_cache-3a407f85-32f9-4831-9b8c-4d237153d9f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.260 2 DEBUG nova.network.neutron [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:24 np0005466013 podman[223728]: 2025-10-02 12:06:24.321585151 +0000 UTC m=+0.059313664 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:06:24 np0005466013 podman[223729]: 2025-10-02 12:06:24.341017068 +0000 UTC m=+0.077258306 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:06:24 np0005466013 podman[223730]: 2025-10-02 12:06:24.350221878 +0000 UTC m=+0.084365358 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.461 2 DEBUG nova.compute.manager [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received event network-changed-9259d441-74e4-4515-8b47-d86ae8e47f98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.461 2 DEBUG nova.compute.manager [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Refreshing instance network info cache due to event network-changed-9259d441-74e4-4515-8b47-d86ae8e47f98. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.461 2 DEBUG oslo_concurrency.lockutils [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-3a407f85-32f9-4831-9b8c-4d237153d9f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:24 np0005466013 nova_compute[192144]: 2025-10-02 12:06:24.611 2 DEBUG nova.network.neutron [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.354 2 DEBUG nova.network.neutron [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Updating instance_info_cache with network_info: [{"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.387 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Releasing lock "refresh_cache-3a407f85-32f9-4831-9b8c-4d237153d9f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.388 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Instance network_info: |[{"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.388 2 DEBUG oslo_concurrency.lockutils [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-3a407f85-32f9-4831-9b8c-4d237153d9f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.388 2 DEBUG nova.network.neutron [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Refreshing network info cache for port 9259d441-74e4-4515-8b47-d86ae8e47f98 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.390 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Start _get_guest_xml network_info=[{"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.394 2 WARNING nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.398 2 DEBUG nova.virt.libvirt.host [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.398 2 DEBUG nova.virt.libvirt.host [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.402 2 DEBUG nova.virt.libvirt.host [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.402 2 DEBUG nova.virt.libvirt.host [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.403 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.404 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.404 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.404 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.405 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.405 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.405 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.405 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.405 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.406 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.406 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.406 2 DEBUG nova.virt.hardware [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.409 2 DEBUG nova.virt.libvirt.vif [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1407570191',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1407570191',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1407570191',id=35,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-9yu2xa9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',owner_user
_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:20Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=3a407f85-32f9-4831-9b8c-4d237153d9f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.410 2 DEBUG nova.network.os_vif_util [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.410 2 DEBUG nova.network.os_vif_util [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:15:bf,bridge_name='br-int',has_traffic_filtering=True,id=9259d441-74e4-4515-8b47-d86ae8e47f98,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9259d441-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.412 2 DEBUG nova.objects.instance [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a407f85-32f9-4831-9b8c-4d237153d9f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.433 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <uuid>3a407f85-32f9-4831-9b8c-4d237153d9f4</uuid>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <name>instance-00000023</name>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1407570191</nova:name>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:06:26</nova:creationTime>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:user uuid="cdc7ec1af4d8410db0b4592293549806">tempest-ImagesOneServerNegativeTestJSON-507683469-project-member</nova:user>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:project uuid="87e7399e976c40bc84f320ed0d052ac6">tempest-ImagesOneServerNegativeTestJSON-507683469</nova:project>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        <nova:port uuid="9259d441-74e4-4515-8b47-d86ae8e47f98">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <entry name="serial">3a407f85-32f9-4831-9b8c-4d237153d9f4</entry>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <entry name="uuid">3a407f85-32f9-4831-9b8c-4d237153d9f4</entry>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk.config"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:bd:15:bf"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <target dev="tap9259d441-74"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/console.log" append="off"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:06:26 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:06:26 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:06:26 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:06:26 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.434 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Preparing to wait for external event network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.434 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.435 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.435 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.435 2 DEBUG nova.virt.libvirt.vif [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1407570191',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1407570191',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1407570191',id=35,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-9yu2xa9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',
owner_user_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:20Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=3a407f85-32f9-4831-9b8c-4d237153d9f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.436 2 DEBUG nova.network.os_vif_util [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.437 2 DEBUG nova.network.os_vif_util [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:15:bf,bridge_name='br-int',has_traffic_filtering=True,id=9259d441-74e4-4515-8b47-d86ae8e47f98,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9259d441-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.437 2 DEBUG os_vif [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:15:bf,bridge_name='br-int',has_traffic_filtering=True,id=9259d441-74e4-4515-8b47-d86ae8e47f98,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9259d441-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.438 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.438 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.442 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9259d441-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.442 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9259d441-74, col_values=(('external_ids', {'iface-id': '9259d441-74e4-4515-8b47-d86ae8e47f98', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:15:bf', 'vm-uuid': '3a407f85-32f9-4831-9b8c-4d237153d9f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:26 np0005466013 NetworkManager[51205]: <info>  [1759406786.4462] manager: (tap9259d441-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.453 2 INFO os_vif [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:15:bf,bridge_name='br-int',has_traffic_filtering=True,id=9259d441-74e4-4515-8b47-d86ae8e47f98,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9259d441-74')#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.729 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.729 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.730 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No VIF found with MAC fa:16:3e:bd:15:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:06:26 np0005466013 nova_compute[192144]: 2025-10-02 12:06:26.730 2 INFO nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Using config drive#033[00m
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.561 2 INFO nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Creating config drive at /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk.config#033[00m
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.565 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3xk7s6km execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.687 2 DEBUG oslo_concurrency.processutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3xk7s6km" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:27 np0005466013 kernel: tap9259d441-74: entered promiscuous mode
Oct  2 08:06:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:27Z|00080|binding|INFO|Claiming lport 9259d441-74e4-4515-8b47-d86ae8e47f98 for this chassis.
Oct  2 08:06:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:27Z|00081|binding|INFO|9259d441-74e4-4515-8b47-d86ae8e47f98: Claiming fa:16:3e:bd:15:bf 10.100.0.3
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:27 np0005466013 NetworkManager[51205]: <info>  [1759406787.7449] manager: (tap9259d441-74): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:27 np0005466013 systemd-udevd[223816]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.771 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:15:bf 10.100.0.3'], port_security=['fa:16:3e:bd:15:bf 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3a407f85-32f9-4831-9b8c-4d237153d9f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-982b406e-0686-44db-8945-39e0f57e4781', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87e7399e976c40bc84f320ed0d052ac6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '410de2f3-62e2-482c-a480-7655c2811e48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f46c1e1f-04ef-471b-85c6-c4415ad3e6bb, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=9259d441-74e4-4515-8b47-d86ae8e47f98) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.773 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 9259d441-74e4-4515-8b47-d86ae8e47f98 in datapath 982b406e-0686-44db-8945-39e0f57e4781 bound to our chassis#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.775 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 982b406e-0686-44db-8945-39e0f57e4781#033[00m
Oct  2 08:06:27 np0005466013 NetworkManager[51205]: <info>  [1759406787.7896] device (tap9259d441-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.789 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[70e3722d-a45e-4c86-9aa8-3fb92a6d929f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.790 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap982b406e-01 in ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.792 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap982b406e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.792 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[18cde5f8-8c0a-4147-8670-738f18b3fa7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 NetworkManager[51205]: <info>  [1759406787.7938] device (tap9259d441-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.794 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e078f34d-2669-4662-ae6c-545c03182fe8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 systemd-machined[152202]: New machine qemu-13-instance-00000023.
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.805 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[84826e35-b4f2-4f2e-9a20-a78423b9c80e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:27Z|00082|binding|INFO|Setting lport 9259d441-74e4-4515-8b47-d86ae8e47f98 ovn-installed in OVS
Oct  2 08:06:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:27Z|00083|binding|INFO|Setting lport 9259d441-74e4-4515-8b47-d86ae8e47f98 up in Southbound
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:27 np0005466013 systemd[1]: Started Virtual Machine qemu-13-instance-00000023.
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.828 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f2096e-7cdc-4c61-bde6-9ea4cb33a3c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 nova_compute[192144]: 2025-10-02 12:06:27.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.861 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c60a4e5c-bb36-4535-aa6f-043de9ce4d74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 NetworkManager[51205]: <info>  [1759406787.8687] manager: (tap982b406e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.867 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c5da20b1-9844-41f0-adb1-d6f62387119e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.903 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[042cc44e-4ed6-4ce8-9c3f-9700cff8d46d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.908 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[67745fb5-bd82-4f93-afd9-aa2e368d32d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 NetworkManager[51205]: <info>  [1759406787.9302] device (tap982b406e-00): carrier: link connected
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.934 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[656bbc9d-65da-4128-9187-30c7b8ea5909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.955 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[07f5c74a-75d3-4a0b-a187-1157db2bf428]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap982b406e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:e2:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477954, 'reachable_time': 27346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223852, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:27.982 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3bdfd5d6-43bc-4b2d-a7fd-d9d1b1bd57c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:e21f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477954, 'tstamp': 477954}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223853, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.001 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f95c72a1-065b-4ff8-8205-c3477ea32b2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap982b406e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:e2:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477954, 'reachable_time': 27346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223854, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.038 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3b3733-3891-4a05-b84a-dabbc3e6e550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.110 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a0166e40-3658-4a71-9e7a-1a89f3c08d01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.113 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap982b406e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.113 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.114 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap982b406e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:28 np0005466013 NetworkManager[51205]: <info>  [1759406788.1175] manager: (tap982b406e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct  2 08:06:28 np0005466013 kernel: tap982b406e-00: entered promiscuous mode
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.126 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap982b406e-00, col_values=(('external_ids', {'iface-id': 'e7c44940-f7d8-482e-a63d-10c99ba9de76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:28Z|00084|binding|INFO|Releasing lport e7c44940-f7d8-482e-a63d-10c99ba9de76 from this chassis (sb_readonly=0)
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.129 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.131 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[064da9ba-b343-43b3-9c9d-7edfa59fb43c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.134 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-982b406e-0686-44db-8945-39e0f57e4781
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 982b406e-0686-44db-8945-39e0f57e4781
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:06:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:28.136 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'env', 'PROCESS_TAG=haproxy-982b406e-0686-44db-8945-39e0f57e4781', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/982b406e-0686-44db-8945-39e0f57e4781.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.252 2 DEBUG nova.compute.manager [req-2fde53dd-a202-4556-87e9-698bebdc65d1 req-220f99f9-fd27-47f8-9141-776dd8f3a313 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received event network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.252 2 DEBUG oslo_concurrency.lockutils [req-2fde53dd-a202-4556-87e9-698bebdc65d1 req-220f99f9-fd27-47f8-9141-776dd8f3a313 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.253 2 DEBUG oslo_concurrency.lockutils [req-2fde53dd-a202-4556-87e9-698bebdc65d1 req-220f99f9-fd27-47f8-9141-776dd8f3a313 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.253 2 DEBUG oslo_concurrency.lockutils [req-2fde53dd-a202-4556-87e9-698bebdc65d1 req-220f99f9-fd27-47f8-9141-776dd8f3a313 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.253 2 DEBUG nova.compute.manager [req-2fde53dd-a202-4556-87e9-698bebdc65d1 req-220f99f9-fd27-47f8-9141-776dd8f3a313 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Processing event network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:06:28 np0005466013 podman[223893]: 2025-10-02 12:06:28.555207034 +0000 UTC m=+0.054305089 container create 072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.571 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.574 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406788.5702977, 3a407f85-32f9-4831-9b8c-4d237153d9f4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.574 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] VM Started (Lifecycle Event)#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.578 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.584 2 INFO nova.virt.libvirt.driver [-] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Instance spawned successfully.#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.585 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.602 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:28 np0005466013 systemd[1]: Started libpod-conmon-072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856.scope.
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.608 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.611 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.611 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.611 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.611 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.612 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.612 2 DEBUG nova.virt.libvirt.driver [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:28 np0005466013 podman[223893]: 2025-10-02 12:06:28.529607098 +0000 UTC m=+0.028705203 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:06:28 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:06:28 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e05f33503b147dc9352d45dd6ca884eb17e8a17a714796ef84fe2a937935839d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.642 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.643 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406788.5706315, 3a407f85-32f9-4831-9b8c-4d237153d9f4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.643 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:06:28 np0005466013 podman[223893]: 2025-10-02 12:06:28.648159848 +0000 UTC m=+0.147257913 container init 072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:06:28 np0005466013 podman[223893]: 2025-10-02 12:06:28.654813594 +0000 UTC m=+0.153911649 container start 072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.670 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.673 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406788.5774984, 3a407f85-32f9-4831-9b8c-4d237153d9f4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.673 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:06:28 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223909]: [NOTICE]   (223913) : New worker (223915) forked
Oct  2 08:06:28 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223909]: [NOTICE]   (223913) : Loading success.
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.694 2 DEBUG nova.network.neutron [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Updated VIF entry in instance network info cache for port 9259d441-74e4-4515-8b47-d86ae8e47f98. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.694 2 DEBUG nova.network.neutron [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Updating instance_info_cache with network_info: [{"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.697 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.700 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.710 2 INFO nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Took 7.74 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.711 2 DEBUG nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.718 2 DEBUG oslo_concurrency.lockutils [req-ba4866eb-6d3b-4aa5-8f54-00f7a0a7122e req-2560d08e-b669-4822-aeb6-665275039a9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-3a407f85-32f9-4831-9b8c-4d237153d9f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.738 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.790 2 INFO nova.compute.manager [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Took 8.58 seconds to build instance.#033[00m
Oct  2 08:06:28 np0005466013 nova_compute[192144]: 2025-10-02 12:06:28.817 2 DEBUG oslo_concurrency.lockutils [None req-10001161-8f0c-441a-90e1-17b37a6881b9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.203 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.204 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.205 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.205 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.205 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.215 2 INFO nova.compute.manager [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Terminating instance#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.225 2 DEBUG nova.compute.manager [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:06:30 np0005466013 kernel: tap9259d441-74 (unregistering): left promiscuous mode
Oct  2 08:06:30 np0005466013 NetworkManager[51205]: <info>  [1759406790.2427] device (tap9259d441-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:06:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:30Z|00085|binding|INFO|Releasing lport 9259d441-74e4-4515-8b47-d86ae8e47f98 from this chassis (sb_readonly=0)
Oct  2 08:06:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:30Z|00086|binding|INFO|Setting lport 9259d441-74e4-4515-8b47-d86ae8e47f98 down in Southbound
Oct  2 08:06:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:30Z|00087|binding|INFO|Removing iface tap9259d441-74 ovn-installed in OVS
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.263 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:15:bf 10.100.0.3'], port_security=['fa:16:3e:bd:15:bf 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3a407f85-32f9-4831-9b8c-4d237153d9f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-982b406e-0686-44db-8945-39e0f57e4781', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87e7399e976c40bc84f320ed0d052ac6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '410de2f3-62e2-482c-a480-7655c2811e48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f46c1e1f-04ef-471b-85c6-c4415ad3e6bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=9259d441-74e4-4515-8b47-d86ae8e47f98) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.264 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 9259d441-74e4-4515-8b47-d86ae8e47f98 in datapath 982b406e-0686-44db-8945-39e0f57e4781 unbound from our chassis#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.265 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 982b406e-0686-44db-8945-39e0f57e4781, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.266 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[72cca121-ee53-4614-85b6-d2099af768e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.267 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 namespace which is not needed anymore#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000023.scope: Deactivated successfully.
Oct  2 08:06:30 np0005466013 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000023.scope: Consumed 2.367s CPU time.
Oct  2 08:06:30 np0005466013 systemd-machined[152202]: Machine qemu-13-instance-00000023 terminated.
Oct  2 08:06:30 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223909]: [NOTICE]   (223913) : haproxy version is 2.8.14-c23fe91
Oct  2 08:06:30 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223909]: [NOTICE]   (223913) : path to executable is /usr/sbin/haproxy
Oct  2 08:06:30 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223909]: [WARNING]  (223913) : Exiting Master process...
Oct  2 08:06:30 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223909]: [ALERT]    (223913) : Current worker (223915) exited with code 143 (Terminated)
Oct  2 08:06:30 np0005466013 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223909]: [WARNING]  (223913) : All workers exited. Exiting... (0)
Oct  2 08:06:30 np0005466013 systemd[1]: libpod-072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856.scope: Deactivated successfully.
Oct  2 08:06:30 np0005466013 podman[223946]: 2025-10-02 12:06:30.400621906 +0000 UTC m=+0.044042218 container died 072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:06:30 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856-userdata-shm.mount: Deactivated successfully.
Oct  2 08:06:30 np0005466013 systemd[1]: var-lib-containers-storage-overlay-e05f33503b147dc9352d45dd6ca884eb17e8a17a714796ef84fe2a937935839d-merged.mount: Deactivated successfully.
Oct  2 08:06:30 np0005466013 podman[223946]: 2025-10-02 12:06:30.442574197 +0000 UTC m=+0.085994509 container cleanup 072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:06:30 np0005466013 NetworkManager[51205]: <info>  [1759406790.4475] manager: (tap9259d441-74): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 systemd[1]: libpod-conmon-072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856.scope: Deactivated successfully.
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.497 2 INFO nova.virt.libvirt.driver [-] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Instance destroyed successfully.#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.498 2 DEBUG nova.objects.instance [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'resources' on Instance uuid 3a407f85-32f9-4831-9b8c-4d237153d9f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:30 np0005466013 podman[223979]: 2025-10-02 12:06:30.51245175 +0000 UTC m=+0.040752928 container remove 072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.521 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8b549c06-3100-4f75-b3ec-ab63d11dcc4e]: (4, ('Thu Oct  2 12:06:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 (072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856)\n072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856\nThu Oct  2 12:06:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 (072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856)\n072d572d20a4748b369ecc9a5f952c826db72d9ecc60c9c9ea65c0108a6b5856\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.522 2 DEBUG nova.virt.libvirt.vif [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:06:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1407570191',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1407570191',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1407570191',id=35,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-9yu2xa9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:06:28Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=3a407f85-32f9-4831-9b8c-4d237153d9f4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.522 2 DEBUG nova.network.os_vif_util [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "9259d441-74e4-4515-8b47-d86ae8e47f98", "address": "fa:16:3e:bd:15:bf", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9259d441-74", "ovs_interfaceid": "9259d441-74e4-4515-8b47-d86ae8e47f98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.523 2 DEBUG nova.network.os_vif_util [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:15:bf,bridge_name='br-int',has_traffic_filtering=True,id=9259d441-74e4-4515-8b47-d86ae8e47f98,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9259d441-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.523 2 DEBUG os_vif [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:15:bf,bridge_name='br-int',has_traffic_filtering=True,id=9259d441-74e4-4515-8b47-d86ae8e47f98,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9259d441-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.523 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bd480f41-625d-4b76-9092-54244d40d8ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.525 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap982b406e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.526 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9259d441-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:30 np0005466013 kernel: tap982b406e-00: left promiscuous mode
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.553 2 INFO os_vif [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:15:bf,bridge_name='br-int',has_traffic_filtering=True,id=9259d441-74e4-4515-8b47-d86ae8e47f98,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9259d441-74')#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.553 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dfaa5e3d-89a7-4bb2-9382-2219aebbda84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.554 2 INFO nova.virt.libvirt.driver [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Deleting instance files /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4_del#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.554 2 INFO nova.virt.libvirt.driver [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Deletion of /var/lib/nova/instances/3a407f85-32f9-4831-9b8c-4d237153d9f4_del complete#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.587 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4067167d-7eea-4fb5-8758-0dbd1185a719]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.589 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bd7dd8-aa9d-4581-8601-19159dd8f54d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.603 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d1616d5c-057e-494d-90ea-e2e56a544b03]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477947, 'reachable_time': 35044, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224008, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.607 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:06:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:30.607 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b5be06df-0a0c-4e4a-b6f4-ac93d8f10ff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:30 np0005466013 systemd[1]: run-netns-ovnmeta\x2d982b406e\x2d0686\x2d44db\x2d8945\x2d39e0f57e4781.mount: Deactivated successfully.
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.659 2 INFO nova.compute.manager [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.660 2 DEBUG oslo.service.loopingcall [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.660 2 DEBUG nova.compute.manager [-] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.660 2 DEBUG nova.network.neutron [-] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.792 2 DEBUG nova.compute.manager [req-f0270c3e-93fa-4c94-bbde-1d4ac52ef180 req-a1bde352-7c7a-4f89-a6c3-aaa81968bb1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received event network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.793 2 DEBUG oslo_concurrency.lockutils [req-f0270c3e-93fa-4c94-bbde-1d4ac52ef180 req-a1bde352-7c7a-4f89-a6c3-aaa81968bb1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.794 2 DEBUG oslo_concurrency.lockutils [req-f0270c3e-93fa-4c94-bbde-1d4ac52ef180 req-a1bde352-7c7a-4f89-a6c3-aaa81968bb1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.794 2 DEBUG oslo_concurrency.lockutils [req-f0270c3e-93fa-4c94-bbde-1d4ac52ef180 req-a1bde352-7c7a-4f89-a6c3-aaa81968bb1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.794 2 DEBUG nova.compute.manager [req-f0270c3e-93fa-4c94-bbde-1d4ac52ef180 req-a1bde352-7c7a-4f89-a6c3-aaa81968bb1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] No waiting events found dispatching network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:30 np0005466013 nova_compute[192144]: 2025-10-02 12:06:30.795 2 WARNING nova.compute.manager [req-f0270c3e-93fa-4c94-bbde-1d4ac52ef180 req-a1bde352-7c7a-4f89-a6c3-aaa81968bb1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received unexpected event network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.269 2 DEBUG nova.network.neutron [-] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.315 2 INFO nova.compute.manager [-] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Took 1.65 seconds to deallocate network for instance.#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.524 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.524 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.599 2 DEBUG nova.compute.provider_tree [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.617 2 DEBUG nova.scheduler.client.report [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.659 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.765 2 INFO nova.scheduler.client.report [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Deleted allocations for instance 3a407f85-32f9-4831-9b8c-4d237153d9f4
Oct  2 08:06:32 np0005466013 nova_compute[192144]: 2025-10-02 12:06:32.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.038 2 DEBUG oslo_concurrency.lockutils [None req-3de582c3-c0ea-4ce9-840a-5eaf3e1feaf5 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.125 2 DEBUG nova.compute.manager [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received event network-vif-unplugged-9259d441-74e4-4515-8b47-d86ae8e47f98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.126 2 DEBUG oslo_concurrency.lockutils [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.126 2 DEBUG oslo_concurrency.lockutils [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.126 2 DEBUG oslo_concurrency.lockutils [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.126 2 DEBUG nova.compute.manager [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] No waiting events found dispatching network-vif-unplugged-9259d441-74e4-4515-8b47-d86ae8e47f98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.126 2 WARNING nova.compute.manager [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received unexpected event network-vif-unplugged-9259d441-74e4-4515-8b47-d86ae8e47f98 for instance with vm_state deleted and task_state None.
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.127 2 DEBUG nova.compute.manager [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received event network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.127 2 DEBUG oslo_concurrency.lockutils [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.127 2 DEBUG oslo_concurrency.lockutils [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.127 2 DEBUG oslo_concurrency.lockutils [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3a407f85-32f9-4831-9b8c-4d237153d9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.127 2 DEBUG nova.compute.manager [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] No waiting events found dispatching network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.128 2 WARNING nova.compute.manager [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received unexpected event network-vif-plugged-9259d441-74e4-4515-8b47-d86ae8e47f98 for instance with vm_state deleted and task_state None.
Oct  2 08:06:33 np0005466013 nova_compute[192144]: 2025-10-02 12:06:33.128 2 DEBUG nova.compute.manager [req-15afc3cd-b699-45a5-ad25-9808a57e419d req-52e1c459-7e45-4443-8c75-21baee1e1210 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Received event network-vif-deleted-9259d441-74e4-4515-8b47-d86ae8e47f98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:06:34 np0005466013 podman[224009]: 2025-10-02 12:06:34.69865184 +0000 UTC m=+0.070904631 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:06:35 np0005466013 nova_compute[192144]: 2025-10-02 12:06:35.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:06:37 np0005466013 podman[224031]: 2025-10-02 12:06:37.714928359 +0000 UTC m=+0.074565176 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Oct  2 08:06:37 np0005466013 podman[224030]: 2025-10-02 12:06:37.73200173 +0000 UTC m=+0.096489986 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:06:37 np0005466013 nova_compute[192144]: 2025-10-02 12:06:37.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:06:40 np0005466013 nova_compute[192144]: 2025-10-02 12:06:40.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:06:41 np0005466013 nova_compute[192144]: 2025-10-02 12:06:41.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:06:42 np0005466013 podman[224069]: 2025-10-02 12:06:42.678627401 +0000 UTC m=+0.051094649 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:06:42 np0005466013 podman[224070]: 2025-10-02 12:06:42.708722812 +0000 UTC m=+0.079969335 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.750 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.750 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.769 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.893 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.894 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.900 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:06:42 np0005466013 nova_compute[192144]: 2025-10-02 12:06:42.900 2 INFO nova.compute.claims [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.065 2 DEBUG nova.compute.provider_tree [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.087 2 DEBUG nova.scheduler.client.report [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.118 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.119 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.195 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.195 2 DEBUG nova.network.neutron [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.210 2 INFO nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.231 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.364 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.366 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.367 2 INFO nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Creating image(s)
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.368 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "/var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.368 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.369 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.384 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.434 2 DEBUG nova.policy [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.441 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.442 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.443 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.453 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.507 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.508 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.542 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.544 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.544 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.601 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.602 2 DEBUG nova.virt.disk.api [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Checking if we can resize image /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.602 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.655 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.656 2 DEBUG nova.virt.disk.api [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Cannot resize image /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.658 2 DEBUG nova.objects.instance [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'migration_context' on Instance uuid a850e122-58e1-4fa2-9555-1564c9c36203 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.674 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.674 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Ensure instance console log exists: /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.675 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.675 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:43 np0005466013 nova_compute[192144]: 2025-10-02 12:06:43.675 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:45 np0005466013 nova_compute[192144]: 2025-10-02 12:06:45.351 2 DEBUG nova.network.neutron [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Successfully created port: 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:06:45 np0005466013 nova_compute[192144]: 2025-10-02 12:06:45.497 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406790.4958088, 3a407f85-32f9-4831-9b8c-4d237153d9f4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:45 np0005466013 nova_compute[192144]: 2025-10-02 12:06:45.497 2 INFO nova.compute.manager [-] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:06:45 np0005466013 nova_compute[192144]: 2025-10-02 12:06:45.520 2 DEBUG nova.compute.manager [None req-77b0cc31-b330-4439-9fc2-e79ac71ccb0b - - - - - -] [instance: 3a407f85-32f9-4831-9b8c-4d237153d9f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:45 np0005466013 nova_compute[192144]: 2025-10-02 12:06:45.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.375 2 DEBUG nova.network.neutron [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Successfully updated port: 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.397 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.398 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquired lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.398 2 DEBUG nova.network.neutron [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.500 2 DEBUG nova.compute.manager [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received event network-changed-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.501 2 DEBUG nova.compute.manager [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Refreshing instance network info cache due to event network-changed-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.501 2 DEBUG oslo_concurrency.lockutils [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:46 np0005466013 nova_compute[192144]: 2025-10-02 12:06:46.606 2 DEBUG nova.network.neutron [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.452 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.453 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.483 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.582 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.582 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.587 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.587 2 INFO nova.compute.claims [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.727 2 DEBUG nova.compute.provider_tree [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.744 2 DEBUG nova.scheduler.client.report [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.764 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.765 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.784 2 DEBUG nova.network.neutron [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Updating instance_info_cache with network_info: [{"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.805 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Releasing lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.806 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Instance network_info: |[{"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.806 2 DEBUG oslo_concurrency.lockutils [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.806 2 DEBUG nova.network.neutron [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Refreshing network info cache for port 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.808 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Start _get_guest_xml network_info=[{"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.814 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.814 2 DEBUG nova.network.neutron [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.817 2 WARNING nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.821 2 DEBUG nova.virt.libvirt.host [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.822 2 DEBUG nova.virt.libvirt.host [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.827 2 DEBUG nova.virt.libvirt.host [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.827 2 DEBUG nova.virt.libvirt.host [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.828 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.828 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.829 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.829 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.829 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.829 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.830 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.830 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.830 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.830 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.830 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.831 2 DEBUG nova.virt.hardware [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.834 2 DEBUG nova.virt.libvirt.vif [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1175198892',display_name='tempest-ServersAdminTestJSON-server-1175198892',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1175198892',id=37,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-wfa8n35h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:43Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=a850e122-58e1-4fa2-9555-1564c9c36203,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.835 2 DEBUG nova.network.os_vif_util [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.835 2 DEBUG nova.network.os_vif_util [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:29:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b2b5a9b-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.836 2 DEBUG nova.objects.instance [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid a850e122-58e1-4fa2-9555-1564c9c36203 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.838 2 INFO nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.851 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <uuid>a850e122-58e1-4fa2-9555-1564c9c36203</uuid>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <name>instance-00000025</name>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServersAdminTestJSON-server-1175198892</nova:name>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:06:47</nova:creationTime>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:user uuid="9258efa4511c4bb3813eca27b75b1008">tempest-ServersAdminTestJSON-1782354187-project-member</nova:user>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:project uuid="db3f04a20fd740c1af3139196dc928d2">tempest-ServersAdminTestJSON-1782354187</nova:project>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        <nova:port uuid="0b2b5a9b-1472-4353-96f8-c4b5d8fe1132">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <entry name="serial">a850e122-58e1-4fa2-9555-1564c9c36203</entry>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <entry name="uuid">a850e122-58e1-4fa2-9555-1564c9c36203</entry>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk.config"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:61:29:fa"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <target dev="tap0b2b5a9b-14"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/console.log" append="off"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:06:47 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:06:47 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:06:47 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:06:47 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.852 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Preparing to wait for external event network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.852 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.852 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.853 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.853 2 DEBUG nova.virt.libvirt.vif [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1175198892',display_name='tempest-ServersAdminTestJSON-server-1175198892',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1175198892',id=37,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-wfa8n35h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:43Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=a850e122-58e1-4fa2-9555-1564c9c36203,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.853 2 DEBUG nova.network.os_vif_util [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.854 2 DEBUG nova.network.os_vif_util [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:29:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b2b5a9b-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.854 2 DEBUG os_vif [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:29:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b2b5a9b-14') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b2b5a9b-14, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b2b5a9b-14, col_values=(('external_ids', {'iface-id': '0b2b5a9b-1472-4353-96f8-c4b5d8fe1132', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:29:fa', 'vm-uuid': 'a850e122-58e1-4fa2-9555-1564c9c36203'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:47 np0005466013 NetworkManager[51205]: <info>  [1759406807.8606] manager: (tap0b2b5a9b-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.862 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.866 2 INFO os_vif [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:29:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b2b5a9b-14')#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.948 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.948 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.948 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No VIF found with MAC fa:16:3e:61:29:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:06:47 np0005466013 nova_compute[192144]: 2025-10-02 12:06:47.949 2 INFO nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Using config drive#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.031 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.033 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.033 2 INFO nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Creating image(s)#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.034 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "/var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.034 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "/var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.035 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "/var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.056 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.111 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.112 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.113 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.127 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.147 2 DEBUG nova.policy [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb4813531b1848edaf57576b1f551d3d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c7ce4d06ab3e4e45b22ec26fe7e71cce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.191 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.192 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.226 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.227 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.227 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.277 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.278 2 DEBUG nova.virt.disk.api [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Checking if we can resize image /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.278 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.334 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.335 2 DEBUG nova.virt.disk.api [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Cannot resize image /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.335 2 DEBUG nova.objects.instance [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lazy-loading 'migration_context' on Instance uuid 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.350 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.350 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Ensure instance console log exists: /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.351 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.351 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.351 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.551 2 INFO nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Creating config drive at /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk.config#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.556 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphclzezss execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.682 2 DEBUG oslo_concurrency.processutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphclzezss" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:48 np0005466013 kernel: tap0b2b5a9b-14: entered promiscuous mode
Oct  2 08:06:48 np0005466013 NetworkManager[51205]: <info>  [1759406808.7334] manager: (tap0b2b5a9b-14): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Oct  2 08:06:48 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:48Z|00088|binding|INFO|Claiming lport 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 for this chassis.
Oct  2 08:06:48 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:48Z|00089|binding|INFO|0b2b5a9b-1472-4353-96f8-c4b5d8fe1132: Claiming fa:16:3e:61:29:fa 10.100.0.13
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.750 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:29:fa 10.100.0.13'], port_security=['fa:16:3e:61:29:fa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.751 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e bound to our chassis#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.753 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:06:48 np0005466013 systemd-udevd[224159]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.764 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9d59e2-16a3-4227-af89-4898416e1cee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.765 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66b5a7c3-f1 in ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.767 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66b5a7c3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.767 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f45b6910-eb55-4d72-a36d-da15f0ca0af1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.768 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c59af126-02a7-4f98-98a7-133f2b6895e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 systemd-machined[152202]: New machine qemu-14-instance-00000025.
Oct  2 08:06:48 np0005466013 NetworkManager[51205]: <info>  [1759406808.7728] device (tap0b2b5a9b-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:06:48 np0005466013 NetworkManager[51205]: <info>  [1759406808.7734] device (tap0b2b5a9b-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.782 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[5c4a70c5-ba38-44d8-95d0-e6e07490ea63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:48 np0005466013 systemd[1]: Started Virtual Machine qemu-14-instance-00000025.
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.795 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[258aad84-d57f-4ff1-be79-1fc990cf3464]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:48Z|00090|binding|INFO|Setting lport 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 ovn-installed in OVS
Oct  2 08:06:48 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:48Z|00091|binding|INFO|Setting lport 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 up in Southbound
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.817 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a53351-edba-4530-948b-14c85374d77c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 NetworkManager[51205]: <info>  [1759406808.8230] manager: (tap66b5a7c3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.823 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e3714507-7ab6-4304-90ad-77d77124da32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.854 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[08e2aef5-2cb5-461c-9cf2-71663b31b3a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.857 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b3830c1a-7fed-403c-ba92-5d72a1454cc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 NetworkManager[51205]: <info>  [1759406808.8785] device (tap66b5a7c3-f0): carrier: link connected
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.884 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f348e7ec-7f15-4679-a4c9-e133e33677ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.899 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a19cd6dd-153a-4390-9946-fb452961eab9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480049, 'reachable_time': 35627, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224193, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.916 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[957bd0fd-4b00-45b6-b664-d2eaaeeaa077]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:7b77'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480049, 'tstamp': 480049}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224194, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.932 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b6fcf4-981a-4ecf-9b1a-0634948d2d53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480049, 'reachable_time': 35627, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224195, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:48.964 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8cc2b6-4226-4cdc-96b1-0d1451341ff3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:48 np0005466013 nova_compute[192144]: 2025-10-02 12:06:48.984 2 DEBUG nova.network.neutron [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Successfully created port: e5808e22-40db-4078-a542-4f4cd632e06e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.019 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2b7286dc-bce3-4ef9-98eb-8379058741a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.020 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.020 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.021 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:49 np0005466013 kernel: tap66b5a7c3-f0: entered promiscuous mode
Oct  2 08:06:49 np0005466013 NetworkManager[51205]: <info>  [1759406809.0241] manager: (tap66b5a7c3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.024 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:49Z|00092|binding|INFO|Releasing lport a0163170-212d-4aba-9028-3d5fb4d45c5b from this chassis (sb_readonly=0)
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.037 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.039 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd3d6db-9b3d-442a-8b51-37ffcd5ed5e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.040 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.pid.haproxy
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:06:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:49.040 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'env', 'PROCESS_TAG=haproxy-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.253 2 DEBUG nova.compute.manager [req-6e8e0317-3345-4a19-a0d4-c7bfa7709737 req-99bfae66-eff6-43e1-9c51-c5e746d0585c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received event network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.254 2 DEBUG oslo_concurrency.lockutils [req-6e8e0317-3345-4a19-a0d4-c7bfa7709737 req-99bfae66-eff6-43e1-9c51-c5e746d0585c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.254 2 DEBUG oslo_concurrency.lockutils [req-6e8e0317-3345-4a19-a0d4-c7bfa7709737 req-99bfae66-eff6-43e1-9c51-c5e746d0585c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.254 2 DEBUG oslo_concurrency.lockutils [req-6e8e0317-3345-4a19-a0d4-c7bfa7709737 req-99bfae66-eff6-43e1-9c51-c5e746d0585c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.254 2 DEBUG nova.compute.manager [req-6e8e0317-3345-4a19-a0d4-c7bfa7709737 req-99bfae66-eff6-43e1-9c51-c5e746d0585c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Processing event network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:06:49 np0005466013 podman[224233]: 2025-10-02 12:06:49.389733494 +0000 UTC m=+0.051307787 container create b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:06:49 np0005466013 systemd[1]: Started libpod-conmon-b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc.scope.
Oct  2 08:06:49 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:06:49 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12d003e85d90663a8a55e3f422c0c1335bfa794efcc275bb0fe0f2abb9865be7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:06:49 np0005466013 podman[224233]: 2025-10-02 12:06:49.364035065 +0000 UTC m=+0.025609388 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:06:49 np0005466013 podman[224233]: 2025-10-02 12:06:49.46349172 +0000 UTC m=+0.125066033 container init b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:06:49 np0005466013 podman[224233]: 2025-10-02 12:06:49.468867729 +0000 UTC m=+0.130442022 container start b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.473 2 DEBUG nova.network.neutron [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Updated VIF entry in instance network info cache for port 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.473 2 DEBUG nova.network.neutron [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Updating instance_info_cache with network_info: [{"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.476 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406809.4758675, a850e122-58e1-4fa2-9555-1564c9c36203 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.476 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] VM Started (Lifecycle Event)#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.478 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.484 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.487 2 INFO nova.virt.libvirt.driver [-] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Instance spawned successfully.#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.488 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:06:49 np0005466013 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224248]: [NOTICE]   (224252) : New worker (224254) forked
Oct  2 08:06:49 np0005466013 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224248]: [NOTICE]   (224252) : Loading success.
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.508 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.511 2 DEBUG oslo_concurrency.lockutils [req-1e90f6bd-14eb-4843-bff6-416444d38d11 req-f750d525-922c-411d-899f-d67812ac5d5a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.513 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.526 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.527 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.527 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.528 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.528 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.528 2 DEBUG nova.virt.libvirt.driver [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.544 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.545 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406809.476239, a850e122-58e1-4fa2-9555-1564c9c36203 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.545 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.716 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.718 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406809.489572, a850e122-58e1-4fa2-9555-1564c9c36203 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.719 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.739 2 INFO nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Took 6.37 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.739 2 DEBUG nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.752 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.755 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.784 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.861 2 INFO nova.compute.manager [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Took 7.02 seconds to build instance.#033[00m
Oct  2 08:06:49 np0005466013 nova_compute[192144]: 2025-10-02 12:06:49.881 2 DEBUG oslo_concurrency.lockutils [None req-8cdd6094-fa74-4209-9fe0-c55d270812ee 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.452 2 DEBUG nova.network.neutron [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Successfully updated port: e5808e22-40db-4078-a542-4f4cd632e06e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.465 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "refresh_cache-9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.465 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquired lock "refresh_cache-9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.465 2 DEBUG nova.network.neutron [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.666 2 DEBUG nova.network.neutron [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.711 2 DEBUG nova.compute.manager [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received event network-changed-e5808e22-40db-4078-a542-4f4cd632e06e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.711 2 DEBUG nova.compute.manager [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Refreshing instance network info cache due to event network-changed-e5808e22-40db-4078-a542-4f4cd632e06e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:06:50 np0005466013 nova_compute[192144]: 2025-10-02 12:06:50.711 2 DEBUG oslo_concurrency.lockutils [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.075 2 DEBUG oslo_concurrency.lockutils [None req-1d21bfc7-e70f-4c3c-ae8d-9d6fc2f134ec 4df934684d2e42ae8975ab3398d26e03 437c25a80e1b4ba9b7de3dab7761571b - - default default] Acquiring lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.076 2 DEBUG oslo_concurrency.lockutils [None req-1d21bfc7-e70f-4c3c-ae8d-9d6fc2f134ec 4df934684d2e42ae8975ab3398d26e03 437c25a80e1b4ba9b7de3dab7761571b - - default default] Acquired lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.076 2 DEBUG nova.network.neutron [None req-1d21bfc7-e70f-4c3c-ae8d-9d6fc2f134ec 4df934684d2e42ae8975ab3398d26e03 437c25a80e1b4ba9b7de3dab7761571b - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.342 2 DEBUG nova.compute.manager [req-589d45c6-9582-466c-9b99-9fba9928fa6e req-0f105c0e-6537-43f9-8ed5-2bd0cae0e641 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received event network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.342 2 DEBUG oslo_concurrency.lockutils [req-589d45c6-9582-466c-9b99-9fba9928fa6e req-0f105c0e-6537-43f9-8ed5-2bd0cae0e641 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.342 2 DEBUG oslo_concurrency.lockutils [req-589d45c6-9582-466c-9b99-9fba9928fa6e req-0f105c0e-6537-43f9-8ed5-2bd0cae0e641 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.343 2 DEBUG oslo_concurrency.lockutils [req-589d45c6-9582-466c-9b99-9fba9928fa6e req-0f105c0e-6537-43f9-8ed5-2bd0cae0e641 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.343 2 DEBUG nova.compute.manager [req-589d45c6-9582-466c-9b99-9fba9928fa6e req-0f105c0e-6537-43f9-8ed5-2bd0cae0e641 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] No waiting events found dispatching network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.343 2 WARNING nova.compute.manager [req-589d45c6-9582-466c-9b99-9fba9928fa6e req-0f105c0e-6537-43f9-8ed5-2bd0cae0e641 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received unexpected event network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.800 2 DEBUG nova.network.neutron [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Updating instance_info_cache with network_info: [{"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.821 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Releasing lock "refresh_cache-9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.821 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Instance network_info: |[{"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.822 2 DEBUG oslo_concurrency.lockutils [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.822 2 DEBUG nova.network.neutron [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Refreshing network info cache for port e5808e22-40db-4078-a542-4f4cd632e06e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.824 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Start _get_guest_xml network_info=[{"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.828 2 WARNING nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.833 2 DEBUG nova.virt.libvirt.host [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.833 2 DEBUG nova.virt.libvirt.host [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.836 2 DEBUG nova.virt.libvirt.host [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.836 2 DEBUG nova.virt.libvirt.host [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.837 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.837 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.837 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.838 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.838 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.838 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.839 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.839 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.839 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.839 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.840 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.840 2 DEBUG nova.virt.hardware [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.845 2 DEBUG nova.virt.libvirt.vif [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1496470881',display_name='tempest-ImagesOneServerTestJSON-server-1496470881',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1496470881',id=38,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7ce4d06ab3e4e45b22ec26fe7e71cce',ramdisk_id='',reservation_id='r-bosl4u32',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-193751',owner_user_name='tempest-ImagesOneServerTestJSON-193751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:47Z,user_data=None,user_id='fb4813531b1848edaf57576b1f551d3d',uuid=9ae3f2a0-cb83-4b09-8d4d-604c431e09e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.846 2 DEBUG nova.network.os_vif_util [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Converting VIF {"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.847 2 DEBUG nova.network.os_vif_util [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:fd:65,bridge_name='br-int',has_traffic_filtering=True,id=e5808e22-40db-4078-a542-4f4cd632e06e,network=Network(b47cf9b2-6909-4c87-b7af-b579f1b91bed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5808e22-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.848 2 DEBUG nova.objects.instance [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lazy-loading 'pci_devices' on Instance uuid 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.876 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <uuid>9ae3f2a0-cb83-4b09-8d4d-604c431e09e2</uuid>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <name>instance-00000026</name>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <nova:name>tempest-ImagesOneServerTestJSON-server-1496470881</nova:name>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:06:51</nova:creationTime>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:user uuid="fb4813531b1848edaf57576b1f551d3d">tempest-ImagesOneServerTestJSON-193751-project-member</nova:user>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:project uuid="c7ce4d06ab3e4e45b22ec26fe7e71cce">tempest-ImagesOneServerTestJSON-193751</nova:project>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        <nova:port uuid="e5808e22-40db-4078-a542-4f4cd632e06e">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <entry name="serial">9ae3f2a0-cb83-4b09-8d4d-604c431e09e2</entry>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <entry name="uuid">9ae3f2a0-cb83-4b09-8d4d-604c431e09e2</entry>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk.config"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:04:fd:65"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <target dev="tape5808e22-40"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/console.log" append="off"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:06:51 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:06:51 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:06:51 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:06:51 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.877 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Preparing to wait for external event network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.878 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.878 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.879 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.880 2 DEBUG nova.virt.libvirt.vif [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1496470881',display_name='tempest-ImagesOneServerTestJSON-server-1496470881',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1496470881',id=38,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7ce4d06ab3e4e45b22ec26fe7e71cce',ramdisk_id='',reservation_id='r-bosl4u32',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-193751',owner_user_name='tempest-ImagesOneServerTestJSON-193751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:47Z,user_data=None,user_id='fb4813531b1848edaf57576b1f551d3d',uuid=9ae3f2a0-cb83-4b09-8d4d-604c431e09e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.881 2 DEBUG nova.network.os_vif_util [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Converting VIF {"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.881 2 DEBUG nova.network.os_vif_util [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:fd:65,bridge_name='br-int',has_traffic_filtering=True,id=e5808e22-40db-4078-a542-4f4cd632e06e,network=Network(b47cf9b2-6909-4c87-b7af-b579f1b91bed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5808e22-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.882 2 DEBUG os_vif [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fd:65,bridge_name='br-int',has_traffic_filtering=True,id=e5808e22-40db-4078-a542-4f4cd632e06e,network=Network(b47cf9b2-6909-4c87-b7af-b579f1b91bed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5808e22-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.885 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.888 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5808e22-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.889 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5808e22-40, col_values=(('external_ids', {'iface-id': 'e5808e22-40db-4078-a542-4f4cd632e06e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:fd:65', 'vm-uuid': '9ae3f2a0-cb83-4b09-8d4d-604c431e09e2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:51 np0005466013 NetworkManager[51205]: <info>  [1759406811.8923] manager: (tape5808e22-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.898 2 INFO os_vif [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fd:65,bridge_name='br-int',has_traffic_filtering=True,id=e5808e22-40db-4078-a542-4f4cd632e06e,network=Network(b47cf9b2-6909-4c87-b7af-b579f1b91bed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5808e22-40')#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.967 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.968 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.968 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] No VIF found with MAC fa:16:3e:04:fd:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:06:51 np0005466013 nova_compute[192144]: 2025-10-02 12:06:51.969 2 INFO nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Using config drive#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.636 2 INFO nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Creating config drive at /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk.config#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.642 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsxff0oz1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.770 2 DEBUG oslo_concurrency.processutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsxff0oz1" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.785 2 DEBUG nova.network.neutron [None req-1d21bfc7-e70f-4c3c-ae8d-9d6fc2f134ec 4df934684d2e42ae8975ab3398d26e03 437c25a80e1b4ba9b7de3dab7761571b - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Updating instance_info_cache with network_info: [{"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.812 2 DEBUG oslo_concurrency.lockutils [None req-1d21bfc7-e70f-4c3c-ae8d-9d6fc2f134ec 4df934684d2e42ae8975ab3398d26e03 437c25a80e1b4ba9b7de3dab7761571b - - default default] Releasing lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.813 2 DEBUG nova.compute.manager [None req-1d21bfc7-e70f-4c3c-ae8d-9d6fc2f134ec 4df934684d2e42ae8975ab3398d26e03 437c25a80e1b4ba9b7de3dab7761571b - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.814 2 DEBUG nova.compute.manager [None req-1d21bfc7-e70f-4c3c-ae8d-9d6fc2f134ec 4df934684d2e42ae8975ab3398d26e03 437c25a80e1b4ba9b7de3dab7761571b - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] network_info to inject: |[{"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Oct  2 08:06:52 np0005466013 kernel: tape5808e22-40: entered promiscuous mode
Oct  2 08:06:52 np0005466013 NetworkManager[51205]: <info>  [1759406812.8278] manager: (tape5808e22-40): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:52Z|00093|binding|INFO|Claiming lport e5808e22-40db-4078-a542-4f4cd632e06e for this chassis.
Oct  2 08:06:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:52Z|00094|binding|INFO|e5808e22-40db-4078-a542-4f4cd632e06e: Claiming fa:16:3e:04:fd:65 10.100.0.11
Oct  2 08:06:52 np0005466013 systemd-udevd[224282]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:52 np0005466013 systemd-machined[152202]: New machine qemu-15-instance-00000026.
Oct  2 08:06:52 np0005466013 NetworkManager[51205]: <info>  [1759406812.8629] device (tape5808e22-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:06:52 np0005466013 NetworkManager[51205]: <info>  [1759406812.8642] device (tape5808e22-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.874 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:fd:65 10.100.0.11'], port_security=['fa:16:3e:04:fd:65 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '9ae3f2a0-cb83-4b09-8d4d-604c431e09e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7ce4d06ab3e4e45b22ec26fe7e71cce', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'edcc431e-1b1d-4cc9-ad73-44b2fb583cc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd1f3cea-66c2-4b36-9423-124e5ddc31df, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=e5808e22-40db-4078-a542-4f4cd632e06e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.876 103323 INFO neutron.agent.ovn.metadata.agent [-] Port e5808e22-40db-4078-a542-4f4cd632e06e in datapath b47cf9b2-6909-4c87-b7af-b579f1b91bed bound to our chassis#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.878 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b47cf9b2-6909-4c87-b7af-b579f1b91bed#033[00m
Oct  2 08:06:52 np0005466013 systemd[1]: Started Virtual Machine qemu-15-instance-00000026.
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.895 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6c2cdc-d903-46ab-ab74-2937f85c80b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.896 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb47cf9b2-61 in ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:06:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:52Z|00095|binding|INFO|Setting lport e5808e22-40db-4078-a542-4f4cd632e06e ovn-installed in OVS
Oct  2 08:06:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:52Z|00096|binding|INFO|Setting lport e5808e22-40db-4078-a542-4f4cd632e06e up in Southbound
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.899 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb47cf9b2-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.899 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6d08e8b3-5419-4e8c-8598-b097da554f79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:52 np0005466013 nova_compute[192144]: 2025-10-02 12:06:52.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.900 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b79378e5-834b-48b3-9c6a-44241b5e6827]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.910 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[39e9b488-e14a-4d4d-a8a3-4fca1518b3ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.936 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7947ade6-3379-4175-9619-3fa16e56f7c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.971 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ca8c3ee7-9749-47ef-999b-d722354b1c1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:52 np0005466013 systemd-udevd[224285]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:52.978 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c567a861-b229-4acb-8f3c-cb2cc802d835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:52 np0005466013 NetworkManager[51205]: <info>  [1759406812.9798] manager: (tapb47cf9b2-60): new Veth device (/org/freedesktop/NetworkManager/Devices/56)
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.010 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad984fb-871b-4e13-b6b1-421e2e41eb17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.013 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[49d72ab1-503b-45e5-a3b7-6dc1180bd001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 NetworkManager[51205]: <info>  [1759406813.0361] device (tapb47cf9b2-60): carrier: link connected
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.040 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7a2709-a05b-466c-bd41-8b10069581f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.058 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4898a43a-9ab3-4600-a647-80719cf87df7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb47cf9b2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f2:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480465, 'reachable_time': 36529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224316, 'error': None, 'target': 'ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.074 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c30e876d-023f-4722-8f74-31f96d2a0876]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:f204'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480465, 'tstamp': 480465}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224317, 'error': None, 'target': 'ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.094 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6786a8-95b0-4832-af12-6d84c9db2d6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb47cf9b2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f2:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480465, 'reachable_time': 36529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224318, 'error': None, 'target': 'ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.124 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f646059a-c02b-4584-a25c-765070a4a82a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.176 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[96458e41-2b2f-4451-aa12-6f183d111047]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.177 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb47cf9b2-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.177 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.177 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb47cf9b2-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:53 np0005466013 kernel: tapb47cf9b2-60: entered promiscuous mode
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:53 np0005466013 NetworkManager[51205]: <info>  [1759406813.1811] manager: (tapb47cf9b2-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.184 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb47cf9b2-60, col_values=(('external_ids', {'iface-id': '24b41591-c1fb-4e86-98cd-f230aa214230'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:06:53Z|00097|binding|INFO|Releasing lport 24b41591-c1fb-4e86-98cd-f230aa214230 from this chassis (sb_readonly=0)
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.188 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b47cf9b2-6909-4c87-b7af-b579f1b91bed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b47cf9b2-6909-4c87-b7af-b579f1b91bed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.196 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f2dcf299-772f-4ce8-9e18-5ddb995adee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.198 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-b47cf9b2-6909-4c87-b7af-b579f1b91bed
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/b47cf9b2-6909-4c87-b7af-b579f1b91bed.pid.haproxy
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID b47cf9b2-6909-4c87-b7af-b579f1b91bed
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:06:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:06:53.198 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'env', 'PROCESS_TAG=haproxy-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b47cf9b2-6909-4c87-b7af-b579f1b91bed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.363 2 DEBUG nova.network.neutron [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Updated VIF entry in instance network info cache for port e5808e22-40db-4078-a542-4f4cd632e06e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.363 2 DEBUG nova.network.neutron [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Updating instance_info_cache with network_info: [{"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.380 2 DEBUG oslo_concurrency.lockutils [req-8ef1cdc9-38ec-44ad-8e77-93fa4bc0ae5a req-0eb85bea-46c8-4d8c-8a04-7113661f6fb3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.437 2 DEBUG nova.compute.manager [req-a124be77-ee4e-4e27-8a80-c1bf299fe00d req-9bd8793f-1a54-47e8-9363-72faa1590a34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received event network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.437 2 DEBUG oslo_concurrency.lockutils [req-a124be77-ee4e-4e27-8a80-c1bf299fe00d req-9bd8793f-1a54-47e8-9363-72faa1590a34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.437 2 DEBUG oslo_concurrency.lockutils [req-a124be77-ee4e-4e27-8a80-c1bf299fe00d req-9bd8793f-1a54-47e8-9363-72faa1590a34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.437 2 DEBUG oslo_concurrency.lockutils [req-a124be77-ee4e-4e27-8a80-c1bf299fe00d req-9bd8793f-1a54-47e8-9363-72faa1590a34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:53 np0005466013 nova_compute[192144]: 2025-10-02 12:06:53.438 2 DEBUG nova.compute.manager [req-a124be77-ee4e-4e27-8a80-c1bf299fe00d req-9bd8793f-1a54-47e8-9363-72faa1590a34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Processing event network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:06:53 np0005466013 podman[224350]: 2025-10-02 12:06:53.550552408 +0000 UTC m=+0.052068435 container create 2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:06:53 np0005466013 systemd[1]: Started libpod-conmon-2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92.scope.
Oct  2 08:06:53 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:06:53 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62da744d75074c76f799558a51b90862c7a85c8fc87de61be96054c4b129c991/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:06:53 np0005466013 podman[224350]: 2025-10-02 12:06:53.611012892 +0000 UTC m=+0.112528939 container init 2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:06:53 np0005466013 podman[224350]: 2025-10-02 12:06:53.524098601 +0000 UTC m=+0.025614668 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:06:53 np0005466013 podman[224350]: 2025-10-02 12:06:53.61636315 +0000 UTC m=+0.117879177 container start 2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:06:53 np0005466013 neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed[224366]: [NOTICE]   (224370) : New worker (224372) forked
Oct  2 08:06:53 np0005466013 neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed[224366]: [NOTICE]   (224370) : Loading success.
Oct  2 08:06:54 np0005466013 podman[224389]: 2025-10-02 12:06:54.675189816 +0000 UTC m=+0.047879880 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.690 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.691 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406814.689826, 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.691 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] VM Started (Lifecycle Event)#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.697 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.701 2 INFO nova.virt.libvirt.driver [-] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Instance spawned successfully.#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.701 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:06:54 np0005466013 podman[224388]: 2025-10-02 12:06:54.703308376 +0000 UTC m=+0.076403645 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.725 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.729 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.743 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.743 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.743 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.744 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.744 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.745 2 DEBUG nova.virt.libvirt.driver [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:54 np0005466013 podman[224390]: 2025-10-02 12:06:54.745552097 +0000 UTC m=+0.114182360 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.910 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.910 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406814.6899805, 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.910 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.937 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.940 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406814.695108, 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.940 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.960 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.963 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.990 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.993 2 INFO nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Took 6.96 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:06:54 np0005466013 nova_compute[192144]: 2025-10-02 12:06:54.994 2 DEBUG nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.068 2 INFO nova.compute.manager [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Took 7.52 seconds to build instance.#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.090 2 DEBUG oslo_concurrency.lockutils [None req-e4fe0104-13b7-4fb8-bb4a-064431786512 fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.703 2 DEBUG nova.compute.manager [req-6bd670a8-b457-4641-a670-38a5fe62ab84 req-9dbafbb4-a8c8-4415-93f0-cfb3a0aa11ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received event network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.704 2 DEBUG oslo_concurrency.lockutils [req-6bd670a8-b457-4641-a670-38a5fe62ab84 req-9dbafbb4-a8c8-4415-93f0-cfb3a0aa11ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.704 2 DEBUG oslo_concurrency.lockutils [req-6bd670a8-b457-4641-a670-38a5fe62ab84 req-9dbafbb4-a8c8-4415-93f0-cfb3a0aa11ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.704 2 DEBUG oslo_concurrency.lockutils [req-6bd670a8-b457-4641-a670-38a5fe62ab84 req-9dbafbb4-a8c8-4415-93f0-cfb3a0aa11ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.705 2 DEBUG nova.compute.manager [req-6bd670a8-b457-4641-a670-38a5fe62ab84 req-9dbafbb4-a8c8-4415-93f0-cfb3a0aa11ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] No waiting events found dispatching network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:55 np0005466013 nova_compute[192144]: 2025-10-02 12:06:55.705 2 WARNING nova.compute.manager [req-6bd670a8-b457-4641-a670-38a5fe62ab84 req-9dbafbb4-a8c8-4415-93f0-cfb3a0aa11ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received unexpected event network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:06:56 np0005466013 nova_compute[192144]: 2025-10-02 12:06:56.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:57 np0005466013 nova_compute[192144]: 2025-10-02 12:06:57.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.367 2 DEBUG nova.compute.manager [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.437 2 INFO nova.compute.manager [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] instance snapshotting#033[00m
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.652 2 INFO nova.virt.libvirt.driver [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Beginning live snapshot process#033[00m
Oct  2 08:06:58 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.825 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.884 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.885 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.943 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:58 np0005466013 nova_compute[192144]: 2025-10-02 12:06:58.959 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.021 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.022 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbahowclc/0b1574cefec041bc93305154ec35e073.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.061 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbahowclc/0b1574cefec041bc93305154ec35e073.delta 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.063 2 INFO nova.virt.libvirt.driver [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.120 2 DEBUG nova.virt.libvirt.guest [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.124 2 INFO nova.virt.libvirt.driver [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.170 2 DEBUG nova.privsep.utils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.171 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbahowclc/0b1574cefec041bc93305154ec35e073.delta /var/lib/nova/instances/snapshots/tmpbahowclc/0b1574cefec041bc93305154ec35e073 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.327 2 DEBUG oslo_concurrency.processutils [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbahowclc/0b1574cefec041bc93305154ec35e073.delta /var/lib/nova/instances/snapshots/tmpbahowclc/0b1574cefec041bc93305154ec35e073" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466013 nova_compute[192144]: 2025-10-02 12:06:59.331 2 INFO nova.virt.libvirt.driver [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:07:01 np0005466013 nova_compute[192144]: 2025-10-02 12:07:01.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:01 np0005466013 nova_compute[192144]: 2025-10-02 12:07:01.921 2 INFO nova.virt.libvirt.driver [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Snapshot image upload complete#033[00m
Oct  2 08:07:01 np0005466013 nova_compute[192144]: 2025-10-02 12:07:01.921 2 INFO nova.compute.manager [None req-bbdd64b9-decb-44e9-be7d-b9940a7d4c2c fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Took 3.47 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:07:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:02.287 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:02.290 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:02.291 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:02Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:29:fa 10.100.0.13
Oct  2 08:07:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:02Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:29:fa 10.100.0.13
Oct  2 08:07:02 np0005466013 nova_compute[192144]: 2025-10-02 12:07:02.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:05 np0005466013 podman[224491]: 2025-10-02 12:07:05.722944681 +0000 UTC m=+0.087676140 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:07:06 np0005466013 nova_compute[192144]: 2025-10-02 12:07:06.450 2 DEBUG nova.compute.manager [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:06 np0005466013 nova_compute[192144]: 2025-10-02 12:07:06.534 2 INFO nova.compute.manager [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] instance snapshotting#033[00m
Oct  2 08:07:06 np0005466013 nova_compute[192144]: 2025-10-02 12:07:06.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:06 np0005466013 nova_compute[192144]: 2025-10-02 12:07:06.951 2 INFO nova.virt.libvirt.driver [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Beginning live snapshot process#033[00m
Oct  2 08:07:07 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.165 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.221 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.222 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.275 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.294 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.351 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.352 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp315950hy/7df3aa4d6b8246bfaf63ce22299ea1e8.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.384 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp315950hy/7df3aa4d6b8246bfaf63ce22299ea1e8.delta 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.385 2 INFO nova.virt.libvirt.driver [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.426 2 DEBUG nova.virt.libvirt.guest [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:07Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:fd:65 10.100.0.11
Oct  2 08:07:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:07Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:fd:65 10.100.0.11
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.929 2 DEBUG nova.virt.libvirt.guest [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.933 2 INFO nova.virt.libvirt.driver [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.968 2 DEBUG nova.privsep.utils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:07:07 np0005466013 nova_compute[192144]: 2025-10-02 12:07:07.969 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp315950hy/7df3aa4d6b8246bfaf63ce22299ea1e8.delta /var/lib/nova/instances/snapshots/tmp315950hy/7df3aa4d6b8246bfaf63ce22299ea1e8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.412 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.413 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.430 2 DEBUG nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.465 2 DEBUG oslo_concurrency.processutils [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp315950hy/7df3aa4d6b8246bfaf63ce22299ea1e8.delta /var/lib/nova/instances/snapshots/tmp315950hy/7df3aa4d6b8246bfaf63ce22299ea1e8" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.472 2 INFO nova.virt.libvirt.driver [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.526 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.527 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.534 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.535 2 INFO nova.compute.claims [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:07:08 np0005466013 podman[224556]: 2025-10-02 12:07:08.682475984 +0000 UTC m=+0.055661287 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  2 08:07:08 np0005466013 podman[224555]: 2025-10-02 12:07:08.688433905 +0000 UTC m=+0.064421441 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.697 2 DEBUG nova.compute.provider_tree [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.713 2 DEBUG nova.scheduler.client.report [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.738 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.739 2 DEBUG nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.794 2 DEBUG nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.811 2 INFO nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.830 2 DEBUG nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.982 2 DEBUG nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.984 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.984 2 INFO nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Creating image(s)#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.985 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "/var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.985 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "/var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.986 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "/var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:08 np0005466013 nova_compute[192144]: 2025-10-02 12:07:08.999 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.059 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.060 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.061 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.075 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.136 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.137 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.172 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.173 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.173 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.228 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.229 2 DEBUG nova.virt.disk.api [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Checking if we can resize image /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.229 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.289 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.290 2 DEBUG nova.virt.disk.api [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Cannot resize image /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.290 2 DEBUG nova.objects.instance [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lazy-loading 'migration_context' on Instance uuid 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.309 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.310 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Ensure instance console log exists: /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.310 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.311 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.311 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.313 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.317 2 WARNING nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.323 2 DEBUG nova.virt.libvirt.host [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.323 2 DEBUG nova.virt.libvirt.host [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.327 2 DEBUG nova.virt.libvirt.host [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.328 2 DEBUG nova.virt.libvirt.host [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.329 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.330 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.330 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.330 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.331 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.331 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.331 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.331 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.332 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.332 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.332 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.332 2 DEBUG nova.virt.hardware [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.337 2 DEBUG nova.objects.instance [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.349 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <uuid>555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952</uuid>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <name>instance-00000029</name>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerDiagnosticsV248Test-server-871354274</nova:name>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:07:09</nova:creationTime>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:        <nova:user uuid="a184841b0d7f4132b40d6c8dc1230655">tempest-ServerDiagnosticsV248Test-955643866-project-member</nova:user>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:        <nova:project uuid="84063a6f8a7246a7ba8decc84421e3d9">tempest-ServerDiagnosticsV248Test-955643866</nova:project>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <entry name="serial">555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952</entry>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <entry name="uuid">555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952</entry>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.config"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/console.log" append="off"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:07:09 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:07:09 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:07:09 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:07:09 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.399 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.399 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.399 2 INFO nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Using config drive#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.676 2 INFO nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Creating config drive at /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.config#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.681 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0ptses8b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:09 np0005466013 nova_compute[192144]: 2025-10-02 12:07:09.806 2 DEBUG oslo_concurrency.processutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0ptses8b" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:09 np0005466013 systemd-machined[152202]: New machine qemu-16-instance-00000029.
Oct  2 08:07:09 np0005466013 systemd[1]: Started Virtual Machine qemu-16-instance-00000029.
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.696 2 INFO nova.virt.libvirt.driver [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Snapshot image upload complete#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.697 2 INFO nova.compute.manager [None req-fd1eac43-03b4-4dd5-a7c7-9c2ce4b035ac fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Took 4.15 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.719 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406830.7192068, 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.720 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.722 2 DEBUG nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.722 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.726 2 INFO nova.virt.libvirt.driver [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Instance spawned successfully.#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.726 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.743 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.747 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.753 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.754 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.754 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.754 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.755 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.755 2 DEBUG nova.virt.libvirt.driver [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.783 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.783 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406830.7221706, 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.784 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.812 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.817 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.842 2 INFO nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Took 1.86 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.843 2 DEBUG nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.852 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.935 2 INFO nova.compute.manager [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Took 2.45 seconds to build instance.#033[00m
Oct  2 08:07:10 np0005466013 nova_compute[192144]: 2025-10-02 12:07:10.949 2 DEBUG oslo_concurrency.lockutils [None req-6e90c2ff-3730-41a5-9939-e97bd1e91d41 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:11 np0005466013 nova_compute[192144]: 2025-10-02 12:07:11.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:11 np0005466013 nova_compute[192144]: 2025-10-02 12:07:11.962 2 DEBUG nova.compute.manager [None req-fe656ece-a956-4ba5-bd50-ede393ae410d 47bbd29b479940d88e5572cd60829e24 56ab74bdf1fd47f0a5cfc316301e49eb - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:11 np0005466013 nova_compute[192144]: 2025-10-02 12:07:11.965 2 INFO nova.compute.manager [None req-fe656ece-a956-4ba5-bd50-ede393ae410d 47bbd29b479940d88e5572cd60829e24 56ab74bdf1fd47f0a5cfc316301e49eb - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Retrieving diagnostics#033[00m
Oct  2 08:07:12 np0005466013 nova_compute[192144]: 2025-10-02 12:07:12.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.099 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.099 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.100 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.100 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.100 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.110 2 INFO nova.compute.manager [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Terminating instance#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.131 2 DEBUG nova.compute.manager [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:07:13 np0005466013 kernel: tape5808e22-40 (unregistering): left promiscuous mode
Oct  2 08:07:13 np0005466013 NetworkManager[51205]: <info>  [1759406833.1634] device (tape5808e22-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00098|binding|INFO|Releasing lport e5808e22-40db-4078-a542-4f4cd632e06e from this chassis (sb_readonly=0)
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00099|binding|INFO|Setting lport e5808e22-40db-4078-a542-4f4cd632e06e down in Southbound
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00100|binding|INFO|Removing iface tape5808e22-40 ovn-installed in OVS
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.178 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:fd:65 10.100.0.11'], port_security=['fa:16:3e:04:fd:65 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '9ae3f2a0-cb83-4b09-8d4d-604c431e09e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7ce4d06ab3e4e45b22ec26fe7e71cce', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'edcc431e-1b1d-4cc9-ad73-44b2fb583cc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd1f3cea-66c2-4b36-9423-124e5ddc31df, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=e5808e22-40db-4078-a542-4f4cd632e06e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.180 103323 INFO neutron.agent.ovn.metadata.agent [-] Port e5808e22-40db-4078-a542-4f4cd632e06e in datapath b47cf9b2-6909-4c87-b7af-b579f1b91bed unbound from our chassis#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.183 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b47cf9b2-6909-4c87-b7af-b579f1b91bed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.186 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[93bebae1-99be-4bf9-940f-43e9202da794]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.186 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed namespace which is not needed anymore#033[00m
Oct  2 08:07:13 np0005466013 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000026.scope: Deactivated successfully.
Oct  2 08:07:13 np0005466013 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000026.scope: Consumed 14.399s CPU time.
Oct  2 08:07:13 np0005466013 systemd-machined[152202]: Machine qemu-15-instance-00000026 terminated.
Oct  2 08:07:13 np0005466013 podman[224643]: 2025-10-02 12:07:13.251460452 +0000 UTC m=+0.061834556 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:07:13 np0005466013 podman[224638]: 2025-10-02 12:07:13.278126327 +0000 UTC m=+0.090622049 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:07:13 np0005466013 kernel: tape5808e22-40: entered promiscuous mode
Oct  2 08:07:13 np0005466013 NetworkManager[51205]: <info>  [1759406833.3504] manager: (tape5808e22-40): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00101|binding|INFO|Claiming lport e5808e22-40db-4078-a542-4f4cd632e06e for this chassis.
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00102|binding|INFO|e5808e22-40db-4078-a542-4f4cd632e06e: Claiming fa:16:3e:04:fd:65 10.100.0.11
Oct  2 08:07:13 np0005466013 kernel: tape5808e22-40 (unregistering): left promiscuous mode
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.362 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:fd:65 10.100.0.11'], port_security=['fa:16:3e:04:fd:65 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '9ae3f2a0-cb83-4b09-8d4d-604c431e09e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7ce4d06ab3e4e45b22ec26fe7e71cce', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'edcc431e-1b1d-4cc9-ad73-44b2fb583cc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd1f3cea-66c2-4b36-9423-124e5ddc31df, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=e5808e22-40db-4078-a542-4f4cd632e06e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:13 np0005466013 neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed[224366]: [NOTICE]   (224370) : haproxy version is 2.8.14-c23fe91
Oct  2 08:07:13 np0005466013 neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed[224366]: [NOTICE]   (224370) : path to executable is /usr/sbin/haproxy
Oct  2 08:07:13 np0005466013 neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed[224366]: [WARNING]  (224370) : Exiting Master process...
Oct  2 08:07:13 np0005466013 neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed[224366]: [ALERT]    (224370) : Current worker (224372) exited with code 143 (Terminated)
Oct  2 08:07:13 np0005466013 neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed[224366]: [WARNING]  (224370) : All workers exited. Exiting... (0)
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00103|binding|INFO|Setting lport e5808e22-40db-4078-a542-4f4cd632e06e ovn-installed in OVS
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00104|binding|INFO|Setting lport e5808e22-40db-4078-a542-4f4cd632e06e up in Southbound
Oct  2 08:07:13 np0005466013 systemd[1]: libpod-2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92.scope: Deactivated successfully.
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00105|binding|INFO|Releasing lport e5808e22-40db-4078-a542-4f4cd632e06e from this chassis (sb_readonly=1)
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00106|binding|INFO|Removing iface tape5808e22-40 ovn-installed in OVS
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00107|if_status|INFO|Not setting lport e5808e22-40db-4078-a542-4f4cd632e06e down as sb is readonly
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00108|binding|INFO|Releasing lport e5808e22-40db-4078-a542-4f4cd632e06e from this chassis (sb_readonly=0)
Oct  2 08:07:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:13Z|00109|binding|INFO|Setting lport e5808e22-40db-4078-a542-4f4cd632e06e down in Southbound
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 podman[224699]: 2025-10-02 12:07:13.384166355 +0000 UTC m=+0.095407306 container died 2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.391 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:fd:65 10.100.0.11'], port_security=['fa:16:3e:04:fd:65 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '9ae3f2a0-cb83-4b09-8d4d-604c431e09e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7ce4d06ab3e4e45b22ec26fe7e71cce', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'edcc431e-1b1d-4cc9-ad73-44b2fb583cc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd1f3cea-66c2-4b36-9423-124e5ddc31df, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=e5808e22-40db-4078-a542-4f4cd632e06e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.408 2 INFO nova.virt.libvirt.driver [-] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Instance destroyed successfully.#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.408 2 DEBUG nova.objects.instance [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lazy-loading 'resources' on Instance uuid 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.423 2 DEBUG nova.virt.libvirt.vif [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1496470881',display_name='tempest-ImagesOneServerTestJSON-server-1496470881',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1496470881',id=38,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c7ce4d06ab3e4e45b22ec26fe7e71cce',ramdisk_id='',reservation_id='r-bosl4u32',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-193751',owner_user_name='tempest-ImagesOneServerTestJSON-193751-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:07:10Z,user_data=None,user_id='fb4813531b1848edaf57576b1f551d3d',uuid=9ae3f2a0-cb83-4b09-8d4d-604c431e09e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.424 2 DEBUG nova.network.os_vif_util [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Converting VIF {"id": "e5808e22-40db-4078-a542-4f4cd632e06e", "address": "fa:16:3e:04:fd:65", "network": {"id": "b47cf9b2-6909-4c87-b7af-b579f1b91bed", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1900090400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7ce4d06ab3e4e45b22ec26fe7e71cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5808e22-40", "ovs_interfaceid": "e5808e22-40db-4078-a542-4f4cd632e06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.425 2 DEBUG nova.network.os_vif_util [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:fd:65,bridge_name='br-int',has_traffic_filtering=True,id=e5808e22-40db-4078-a542-4f4cd632e06e,network=Network(b47cf9b2-6909-4c87-b7af-b579f1b91bed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5808e22-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.425 2 DEBUG os_vif [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fd:65,bridge_name='br-int',has_traffic_filtering=True,id=e5808e22-40db-4078-a542-4f4cd632e06e,network=Network(b47cf9b2-6909-4c87-b7af-b579f1b91bed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5808e22-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.429 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5808e22-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.435 2 INFO os_vif [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fd:65,bridge_name='br-int',has_traffic_filtering=True,id=e5808e22-40db-4078-a542-4f4cd632e06e,network=Network(b47cf9b2-6909-4c87-b7af-b579f1b91bed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5808e22-40')#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.436 2 INFO nova.virt.libvirt.driver [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Deleting instance files /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2_del#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.440 2 INFO nova.virt.libvirt.driver [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Deletion of /var/lib/nova/instances/9ae3f2a0-cb83-4b09-8d4d-604c431e09e2_del complete#033[00m
Oct  2 08:07:13 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92-userdata-shm.mount: Deactivated successfully.
Oct  2 08:07:13 np0005466013 systemd[1]: var-lib-containers-storage-overlay-62da744d75074c76f799558a51b90862c7a85c8fc87de61be96054c4b129c991-merged.mount: Deactivated successfully.
Oct  2 08:07:13 np0005466013 podman[224699]: 2025-10-02 12:07:13.483091012 +0000 UTC m=+0.194331963 container cleanup 2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:07:13 np0005466013 systemd[1]: libpod-conmon-2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92.scope: Deactivated successfully.
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.503 2 INFO nova.compute.manager [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.503 2 DEBUG oslo.service.loopingcall [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.504 2 DEBUG nova.compute.manager [-] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.504 2 DEBUG nova.network.neutron [-] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:07:13 np0005466013 podman[224742]: 2025-10-02 12:07:13.577066114 +0000 UTC m=+0.065645427 container remove 2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.582 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[59ee1a38-14cf-4434-a9a9-b2ebab53c155]: (4, ('Thu Oct  2 12:07:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed (2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92)\n2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92\nThu Oct  2 12:07:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed (2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92)\n2aab6e9db3580fabef6b5182d66366a3402f96b13fbf918c5ab7edac773e9e92\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.583 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[641fd5fc-9264-4ac1-87a1-e3092d75a238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.584 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb47cf9b2-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 kernel: tapb47cf9b2-60: left promiscuous mode
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.603 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[061fcc73-8916-4762-8252-747e79d2806b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.634 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6643e6b9-072a-4e7f-aded-61b3cec0e138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.635 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa3db4f-a694-40ed-9601-e43f20670c66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.651 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[73240997-2d44-4dca-9b8d-dad8b07cb458]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480458, 'reachable_time': 21191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224757, 'error': None, 'target': 'ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 systemd[1]: run-netns-ovnmeta\x2db47cf9b2\x2d6909\x2d4c87\x2db7af\x2db579f1b91bed.mount: Deactivated successfully.
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.653 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b47cf9b2-6909-4c87-b7af-b579f1b91bed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.654 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[6d24d523-f9f5-4610-a88c-935b494f8095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.656 103323 INFO neutron.agent.ovn.metadata.agent [-] Port e5808e22-40db-4078-a542-4f4cd632e06e in datapath b47cf9b2-6909-4c87-b7af-b579f1b91bed unbound from our chassis#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.658 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b47cf9b2-6909-4c87-b7af-b579f1b91bed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.658 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd518c0-e598-47d4-8535-22a533f42ec5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.659 103323 INFO neutron.agent.ovn.metadata.agent [-] Port e5808e22-40db-4078-a542-4f4cd632e06e in datapath b47cf9b2-6909-4c87-b7af-b579f1b91bed unbound from our chassis#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.660 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b47cf9b2-6909-4c87-b7af-b579f1b91bed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:07:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:13.660 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6b8f4774-9dd8-4d28-b2ef-9727dba786d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.694 2 DEBUG nova.compute.manager [req-4446c546-b450-4b24-b16f-c0706db23239 req-ec495f35-13f1-4f45-8a62-678908a381e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received event network-vif-unplugged-e5808e22-40db-4078-a542-4f4cd632e06e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.695 2 DEBUG oslo_concurrency.lockutils [req-4446c546-b450-4b24-b16f-c0706db23239 req-ec495f35-13f1-4f45-8a62-678908a381e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.695 2 DEBUG oslo_concurrency.lockutils [req-4446c546-b450-4b24-b16f-c0706db23239 req-ec495f35-13f1-4f45-8a62-678908a381e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.696 2 DEBUG oslo_concurrency.lockutils [req-4446c546-b450-4b24-b16f-c0706db23239 req-ec495f35-13f1-4f45-8a62-678908a381e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.696 2 DEBUG nova.compute.manager [req-4446c546-b450-4b24-b16f-c0706db23239 req-ec495f35-13f1-4f45-8a62-678908a381e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] No waiting events found dispatching network-vif-unplugged-e5808e22-40db-4078-a542-4f4cd632e06e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.697 2 DEBUG nova.compute.manager [req-4446c546-b450-4b24-b16f-c0706db23239 req-ec495f35-13f1-4f45-8a62-678908a381e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received event network-vif-unplugged-e5808e22-40db-4078-a542-4f4cd632e06e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:07:13 np0005466013 nova_compute[192144]: 2025-10-02 12:07:13.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.013 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.014 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.014 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.014 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.077 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.142 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.143 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.200 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.205 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.269 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.270 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.326 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:14.455 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:14.456 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.471 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.472 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5338MB free_disk=73.43476486206055GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.473 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.473 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.491 2 DEBUG nova.network.neutron [-] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.537 2 INFO nova.compute.manager [-] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Took 1.03 seconds to deallocate network for instance.#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.580 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance a850e122-58e1-4fa2-9555-1564c9c36203 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.581 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.581 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.581 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.581 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.603 2 DEBUG nova.compute.manager [req-1994aff0-2e43-4f33-aaf7-d06ce583455b req-2f6f35ad-e39e-4d01-97b6-b008827ae557 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received event network-vif-deleted-e5808e22-40db-4078-a542-4f4cd632e06e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.675 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.701 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.714 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.735 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.736 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.737 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.819 2 DEBUG nova.compute.provider_tree [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.834 2 DEBUG nova.scheduler.client.report [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.854 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.887 2 INFO nova.scheduler.client.report [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Deleted allocations for instance 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2#033[00m
Oct  2 08:07:14 np0005466013 nova_compute[192144]: 2025-10-02 12:07:14.958 2 DEBUG oslo_concurrency.lockutils [None req-a68ff774-a880-4c7a-bed8-0b307117f28a fb4813531b1848edaf57576b1f551d3d c7ce4d06ab3e4e45b22ec26fe7e71cce - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.738 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.739 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.739 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.782 2 DEBUG nova.compute.manager [req-7070a021-63b4-434f-b8b3-1223b3c17eca req-1e66286e-5e05-46d1-aea0-c74409b820ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received event network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.783 2 DEBUG oslo_concurrency.lockutils [req-7070a021-63b4-434f-b8b3-1223b3c17eca req-1e66286e-5e05-46d1-aea0-c74409b820ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.783 2 DEBUG oslo_concurrency.lockutils [req-7070a021-63b4-434f-b8b3-1223b3c17eca req-1e66286e-5e05-46d1-aea0-c74409b820ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.784 2 DEBUG oslo_concurrency.lockutils [req-7070a021-63b4-434f-b8b3-1223b3c17eca req-1e66286e-5e05-46d1-aea0-c74409b820ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9ae3f2a0-cb83-4b09-8d4d-604c431e09e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.784 2 DEBUG nova.compute.manager [req-7070a021-63b4-434f-b8b3-1223b3c17eca req-1e66286e-5e05-46d1-aea0-c74409b820ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] No waiting events found dispatching network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:15 np0005466013 nova_compute[192144]: 2025-10-02 12:07:15.784 2 WARNING nova.compute.manager [req-7070a021-63b4-434f-b8b3-1223b3c17eca req-1e66286e-5e05-46d1-aea0-c74409b820ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Received unexpected event network-vif-plugged-e5808e22-40db-4078-a542-4f4cd632e06e for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.344 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'name': 'tempest-ServersAdminTestJSON-server-1175198892', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000025', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'db3f04a20fd740c1af3139196dc928d2', 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'hostId': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.347 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000029', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '84063a6f8a7246a7ba8decc84421e3d9', 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'hostId': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.350 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a850e122-58e1-4fa2-9555-1564c9c36203 / tap0b2b5a9b-14 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.350 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1482dc2c-f487-4221-9374-e0c7b49b6f1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.348140', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '56773e66-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '22bb8b910906cf9e76083668f9b1266a2094a60ca4fb21204b3bbc16baecaf62'}]}, 'timestamp': '2025-10-02 12:07:16.352579', '_unique_id': 'a5ddbb03ce694788a7219040b536e3e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.354 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.355 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88304898-136d-4466-afdc-7774ffac8dfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.355377', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '5677f43c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '96b65ab47c3d6e9224f848940e106c48f580061a9507261d001ef08de04d0b93'}]}, 'timestamp': '2025-10-02 12:07:16.355760', '_unique_id': '63f749b1bc914269a2b1ca387320e4fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.356 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.357 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.incoming.bytes volume: 1736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85933246-cf25-4569-9d62-ee7ef252461d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1736, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.357139', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '5678373a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '885847f77debd16f22319191aa72c5e3d9ede85881bca24362473cb0734bed93'}]}, 'timestamp': '2025-10-02 12:07:16.357485', '_unique_id': '0176224440db4afc8486e849c989b19b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.358 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.372 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.373 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.384 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.385 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '883bf92a-ab4a-49f3-95f8-9233decabe74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.358709', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '567ab5a0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.036931784, 'message_signature': '34fab405d15bfa0623997b1bd6a99bc520cc1c7b4431a55963b6322aab90fb26'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.358709', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '567ac838-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.036931784, 'message_signature': 'a3353d8922a4958d57ea183daa0d2ee6fbcc60cbdf722f7effede5ca20a37923'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.358709', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '567c7b10-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.05250158, 'message_signature': '6dfc4d983071f4e294716046bce468b613c5caacf471f6f99c48c2d00e03c21c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.358709', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '567c8d62-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.05250158, 'message_signature': 'd2d40c12354c61e232d0f5b26a430879aa48ab20214d26e5ec2194a21f41c088'}]}, 'timestamp': '2025-10-02 12:07:16.385953', '_unique_id': '779c90e31eb64ac3a2a3ed468749e6ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.388 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.389 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.412 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.read.requests volume: 1107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.413 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.434 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.435 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07769edd-978f-4e93-8064-5cd8103bf8b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1107, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.389385', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5680afd2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': '7cc6701f3f764aaafd02b57e28c8e8904eab9a0323106d497134a84b7f839d4d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': 
None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.389385', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5680c490-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': 'eca9fa65a8455eb829822986a5ffc5e340bbae02e50bbf06657c03229fdd69d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.389385', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5684072c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': 'c7200f5473248a48a3b2b1419af89ce3f787f4f22dd3c2520221da7648e52d8c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.389385', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56841924-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': 'dd65d1e312995df03bfc7b69ecd922e1b14f44e9769c9bddaa087031c73e14af'}]}, 'timestamp': '2025-10-02 12:07:16.435353', '_unique_id': '0b427348187a40ec802e4bda3f002a98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.437 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.437 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.438 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>]
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.438 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.438 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.438 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.439 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.439 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e36518a-f538-4603-8c95-d1bbe136eb97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 313, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.438373', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56849d7c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': 'b7f66bab317c41909887ec50a3207a5f1eabd4170f333c728f76503711e92a57'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.438373', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5684ad62-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': 'aa935e33bce674ac23d40c6788076c7e36b8baef5ac5a4ee75730cd70a719715'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.438373', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5684b910-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': '0493219d24dd375a440707701102ae8dbe55666bba3a8272eac0d2236f035524'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.438373', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5684c3f6-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': '8c307a22cfbd4ad36cb099d226e13c5a122ce3d84f199a1950835ac8b374ef21'}]}, 'timestamp': '2025-10-02 12:07:16.439696', '_unique_id': 'ec0dd56fa4ed4c969bc5ebc256c0bce7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.440 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.441 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.441 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1beefd1b-6aea-4ff7-8cd7-bb9708116a8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.441636', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '56851b26-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': 'f21dff01fccc91d2c9fa88fccdad36a1b17fb9130afae38c8982c6bb0072014d'}]}, 'timestamp': '2025-10-02 12:07:16.441926', '_unique_id': '1cf4c836b89e4da6bf2cedb447a2cbe3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a154aff-20e5-424f-8eb7-e622a4505b97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.443110', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '568553de-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '08e89b1654ac506e89198d906fbe4f25d861e9ea93be7c6633a3c2daf5f2e045'}]}, 'timestamp': '2025-10-02 12:07:16.443345', '_unique_id': 'b4afc4ca3e5f41859952856ec4be0721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.444 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.read.latency volume: 470638913 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.444 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.read.latency volume: 47922474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.445 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.read.latency volume: 501453107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.445 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.read.latency volume: 1700414 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de349e2a-871f-4967-acf3-33712d440f17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 470638913, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.444480', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56858ade-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': '7aec528d79e10414c7620b96d8192148a8eb75852982c9ac1d151ccd365366fa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47922474, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.444480', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '568599a2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': '7c32988908761bcc73a01745b21fc271a13744a607f15252b0afe92f170c6486'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 501453107, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.444480', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5685a1fe-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': 'b5671dd994ff447db558febd0889c69bee0c8714adfa1d4326708ddacb605a96'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1700414, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.444480', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5685a9ce-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': '79cd8838ce4881bd1c4d4e59465f1bead0dd3b471a58e4baf78717d1c12d8d78'}]}, 'timestamp': '2025-10-02 12:07:16.445537', '_unique_id': '9964f3dc93c44d489560282f8b8dcbb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.446 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.447 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.447 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.447 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.447 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc4d8986-c509-4075-b84e-ab202e00e3ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.446979', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5685eae2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.036931784, 'message_signature': '2398f508bb7057a31d776e3675da5db8138736020cfab46df375d28bcb926eab'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.446979', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5685f2a8-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.036931784, 'message_signature': '6ac379a8dcd4487f8ae1e7f87d694729affbf24d8e236126f3d2de1f1c739efc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.446979', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5685fa64-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.05250158, 'message_signature': '4c06cb0ada43763c6beeaf94dcdc3bbf36226ef33353eb38b9d4caadee1ea2a8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.446979', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '568601c6-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.05250158, 'message_signature': 'bb43932d71e06e396ebe3b73eae7577aa2ff8f27b1fde389a0f7b5468ec6af21'}]}, 'timestamp': '2025-10-02 12:07:16.447776', '_unique_id': '7680b37333ca45719066bbf4196981f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.448 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.449 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.449 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>]
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.449 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53b1b031-009d-49a8-a06a-c7b15faafc40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.449355', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '5686480c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '46784d15d10625d43a64bc8c67de64ef6aad75d71a956f07a75a481a399d747a'}]}, 'timestamp': '2025-10-02 12:07:16.449594', '_unique_id': '8c9a3199eff543369df245f925de6fc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.450 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.451 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.451 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b4d2203-3867-4da3-8f6e-a6f654c4d3f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.450681', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56867b38-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.036931784, 'message_signature': '75e9ec3f19f7c1ea4222968bfd5f480d477007b2d69560b892c43e65eeed2934'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.450681', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56868402-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.036931784, 'message_signature': 'b2bac025ac5a385068f33f3aceaf8ac268f8d1009eda077fbf093997c0152a25'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.450681', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56868ba0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.05250158, 'message_signature': '980f871717f54bbc6d62b102427f71b716e1cbd8f7e644363607805fcdc9dc49'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.450681', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56869302-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.05250158, 'message_signature': '315c3e2ff31c07b5f05a354619242d40a4ea0991d2d668abaa219d92f883a5f7'}]}, 'timestamp': '2025-10-02 12:07:16.451495', '_unique_id': '0822e762ba8a40ab9c73b4aa54fe0912'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.452 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd652cb8e-5184-4b59-b150-d9c99d6ab120', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.452620', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '5686c7fa-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '65fa109fe1931bfed36cb5172dc66f9cd259b0f9d0cf22d48ecbd41c1bc995ae'}]}, 'timestamp': '2025-10-02 12:07:16.452890', '_unique_id': 'e8e7206c7851471f8ced82c93cf9f615'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.453 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.write.latency volume: 2212722383 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.454 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.454 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.454 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb34a6f6-3a9c-4539-99f2-b9e5ca3da8ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2212722383, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.453960', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5686fb58-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': '13418a1646e5db830f0afbfe025d84f695b8c4423f2cda21b2e8aa3b12d5a1cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.453960', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '568702f6-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': '0ae13509da4642c5b752990c93cb2e2398f83fc1f701a3780e31dd4e293ae5ad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.453960', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56870a76-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': '69581b2a8ff83610d8f8909dff37535ce6c2f0988a2d45d2f3bd7ee6a0aea0be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.453960', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '568711e2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': '041f4417a2a6a4d6fec7c35ca479509f2efceafe31f0e76b8a0e47760e4dc7c5'}]}, 'timestamp': '2025-10-02 12:07:16.454745', '_unique_id': '59119acfd33c490ab29afdb5f6d64940'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.456 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aedac7b-b9ff-46ec-98cd-dcd479bf1e39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.456313', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '5687576a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': 'fa3c46ad0f9f33639954533e953361b70bf33402deffdc063358f4d9a7952dde'}]}, 'timestamp': '2025-10-02 12:07:16.456543', '_unique_id': 'ea0c6cd2183a41c1b6650b3b5ef445a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.write.bytes volume: 72876032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.457 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.458 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.458 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc92d531-6b22-4f60-a5de-f122cc1697bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72876032, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.457715', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56878f78-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': '340c7c1f247747b976d3bd683b7dcb20c194ab89e07de680a865a0573db565e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.457715', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56879770-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': 'd20b36a891ffcbc7a19598340d4f500a9f45fe74df788532f9f21908bada4c61'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.457715', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56879f22-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': 'c57408b0d14012fbe94d8b7e80c8bde53197732202926eda39e27ef527de7e32'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.457715', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5687a68e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': 'd78f62a6fd27a96e41160353463f99716fec593a3125535dc79564e1b739b19d'}]}, 'timestamp': '2025-10-02 12:07:16.458550', '_unique_id': 'd1bf5b2838994e8ca8a82d7c2e5cddfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>]
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.459 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:07:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:16.459 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.474 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/memory.usage volume: 42.5703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.489 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.490 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a7db6c4-71f5-4b32-9a13-4493512df4d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.5703125, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'timestamp': '2025-10-02T12:07:16.460038', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '568a3dea-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.152983633, 'message_signature': '266b924394bf7b576b1909e5b2145577403fc46ba068787614a401db4db8be6b'}]}, 'timestamp': '2025-10-02 12:07:16.490274', '_unique_id': '6b2af669394e4d088e3319bbd5868715'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.491 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '295cad21-1eb4-4c45-aef2-1c9aa947b777', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.491960', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '568cc916-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '52b9ec526a2575d4ec867ed78515590720e22a0920b2ff4cc7ce2da6698a321c'}]}, 'timestamp': '2025-10-02 12:07:16.492250', '_unique_id': 'c09be7235b1841f3a6dff142809d430c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.493 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.493 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.493 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1175198892>, <NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-871354274>]
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.494 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc1f74cd-0c8b-45d8-9199-86f5de53266b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000025-a850e122-58e1-4fa2-9555-1564c9c36203-tap0b2b5a9b-14', 'timestamp': '2025-10-02T12:07:16.494227', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'tap0b2b5a9b-14', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:29:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b2b5a9b-14'}, 'message_id': '568d2258-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.026359924, 'message_signature': '8c4cc8ac79365cb4d242238cbe201e2db213c77c964b885672592635904d2f8c'}]}, 'timestamp': '2025-10-02 12:07:16.494527', '_unique_id': 'c1b9e63368b7461093f9fe5f158af053'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 nova_compute[192144]: 2025-10-02 12:07:16.493 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 nova_compute[192144]: 2025-10-02 12:07:16.494 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 nova_compute[192144]: 2025-10-02 12:07:16.494 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 nova_compute[192144]: 2025-10-02 12:07:16.494 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid a850e122-58e1-4fa2-9555-1564c9c36203 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.495 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.read.bytes volume: 30743040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.496 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.496 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.496 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7af7d565-2942-45f5-8cc5-4fa007ba1231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30743040, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-vda', 'timestamp': '2025-10-02T12:07:16.495692', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '568d5bb0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': 'b17203915ecc8a72a31af91479f0c19bb3f532fa4535822fb2082f4b4d838ed4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203-sda', 'timestamp': '2025-10-02T12:07:16.495692', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '568d6664-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.067680321, 'message_signature': '4112cbcf36079976c52be2a0e65de826babfc2e9dc4f1859d988cd6bc91c7201'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-vda', 'timestamp': '2025-10-02T12:07:16.495692', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '568d6f42-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': '2ef37bf252d0bc9442d56887932ada2c69fc8418265d74f359517d2dc6b19788'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-sda', 'timestamp': '2025-10-02T12:07:16.495692', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '568d78f2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.09174877, 'message_signature': 'd2954b66c289d032fde4866970818b12036b70762438c55533d8bd587892efa2'}]}, 'timestamp': '2025-10-02 12:07:16.496711', '_unique_id': '78fa84f7a2c34d33b35e29b1ab610dad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.497 12 DEBUG ceilometer.compute.pollsters [-] a850e122-58e1-4fa2-9555-1564c9c36203/cpu volume: 11590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 DEBUG ceilometer.compute.pollsters [-] 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952/cpu volume: 5560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dafe9de4-7fd1-4ff7-89be-402d24cd611c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11590000000, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'timestamp': '2025-10-02T12:07:16.497958', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1175198892', 'name': 'instance-00000025', 'instance_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'instance_type': 'm1.nano', 'host': '04d9a0281125c42669486905634c8366529c7bf74f6151e80b76ceb5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '568db36c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.152983633, 'message_signature': '5c9ed19e002729e3494298a9353af39ae06888fd85c6331583618e462ab21cd3'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5560000000, 'user_id': 'a184841b0d7f4132b40d6c8dc1230655', 'user_name': None, 'project_id': '84063a6f8a7246a7ba8decc84421e3d9', 'project_name': None, 'resource_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'timestamp': '2025-10-02T12:07:16.497958', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-871354274', 'name': 'instance-00000029', 'instance_id': '555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952', 'instance_type': 'm1.nano', 'host': '8b07f79e1fe44788e8f684fd89060f0f0a10e69c798d7c3a4b53b7b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '568dbdee-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 4828.167950746, 'message_signature': 'ca440f69fef5e5546fe351fbe3cfe65b10f578bfaf708e4ebcee69155ff23eac'}]}, 'timestamp': '2025-10-02 12:07:16.498491', '_unique_id': '04f0ce7655b940ef98379b98475bc65e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:07:16.498 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466013 nova_compute[192144]: 2025-10-02 12:07:17.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:18 np0005466013 nova_compute[192144]: 2025-10-02 12:07:18.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.538 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Updating instance_info_cache with network_info: [{"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.621 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-a850e122-58e1-4fa2-9555-1564c9c36203" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.621 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.622 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.622 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.623 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.623 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.623 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.624 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:07:19 np0005466013 nova_compute[192144]: 2025-10-02 12:07:19.624 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:07:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:20Z|00110|binding|INFO|Releasing lport a0163170-212d-4aba-9028-3d5fb4d45c5b from this chassis (sb_readonly=0)
Oct  2 08:07:20 np0005466013 nova_compute[192144]: 2025-10-02 12:07:20.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.079 2 DEBUG nova.compute.manager [None req-0f1e4a78-9b2b-4ecb-ae9c-099d247e5a0c 47bbd29b479940d88e5572cd60829e24 56ab74bdf1fd47f0a5cfc316301e49eb - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.082 2 INFO nova.compute.manager [None req-0f1e4a78-9b2b-4ecb-ae9c-099d247e5a0c 47bbd29b479940d88e5572cd60829e24 56ab74bdf1fd47f0a5cfc316301e49eb - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Retrieving diagnostics
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.409 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.410 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.410 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.410 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.411 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.423 2 INFO nova.compute.manager [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Terminating instance
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.435 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "refresh_cache-555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.436 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquired lock "refresh_cache-555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.436 2 DEBUG nova.network.neutron [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.625 2 DEBUG nova.network.neutron [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.951 2 DEBUG nova.network.neutron [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.968 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Releasing lock "refresh_cache-555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:07:22 np0005466013 nova_compute[192144]: 2025-10-02 12:07:22.969 2 DEBUG nova.compute.manager [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:07:23 np0005466013 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000029.scope: Deactivated successfully.
Oct  2 08:07:23 np0005466013 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000029.scope: Consumed 12.286s CPU time.
Oct  2 08:07:23 np0005466013 systemd-machined[152202]: Machine qemu-16-instance-00000029 terminated.
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.207 2 INFO nova.virt.libvirt.driver [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Instance destroyed successfully.
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.208 2 DEBUG nova.objects.instance [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lazy-loading 'resources' on Instance uuid 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.223 2 INFO nova.virt.libvirt.driver [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Deleting instance files /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952_del
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.224 2 INFO nova.virt.libvirt.driver [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Deletion of /var/lib/nova/instances/555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952_del complete
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.298 2 INFO nova.compute.manager [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Took 0.33 seconds to destroy the instance on the hypervisor.
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.298 2 DEBUG oslo.service.loopingcall [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.299 2 DEBUG nova.compute.manager [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.299 2 DEBUG nova.network.neutron [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.494 2 DEBUG nova.network.neutron [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.507 2 DEBUG nova.network.neutron [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.524 2 INFO nova.compute.manager [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Took 0.22 seconds to deallocate network for instance.#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.601 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.602 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.692 2 DEBUG nova.compute.provider_tree [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.705 2 DEBUG nova.scheduler.client.report [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.731 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.758 2 INFO nova.scheduler.client.report [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Deleted allocations for instance 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.831 2 DEBUG oslo_concurrency.lockutils [None req-b3a75c8f-6e08-4c27-bb7b-e487bd18aeb8 a184841b0d7f4132b40d6c8dc1230655 84063a6f8a7246a7ba8decc84421e3d9 - - default default] Lock "555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:23 np0005466013 nova_compute[192144]: 2025-10-02 12:07:23.874 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:25 np0005466013 podman[224792]: 2025-10-02 12:07:25.676382874 +0000 UTC m=+0.050298426 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:07:25 np0005466013 podman[224793]: 2025-10-02 12:07:25.676403064 +0000 UTC m=+0.050282455 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:07:25 np0005466013 podman[224794]: 2025-10-02 12:07:25.710208401 +0000 UTC m=+0.079870022 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:07:27 np0005466013 nova_compute[192144]: 2025-10-02 12:07:27.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:28 np0005466013 nova_compute[192144]: 2025-10-02 12:07:28.407 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406833.4052958, 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:28 np0005466013 nova_compute[192144]: 2025-10-02 12:07:28.407 2 INFO nova.compute.manager [-] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:07:28 np0005466013 nova_compute[192144]: 2025-10-02 12:07:28.427 2 DEBUG nova.compute.manager [None req-24ab9147-6e68-484b-9946-9da843989055 - - - - - -] [instance: 9ae3f2a0-cb83-4b09-8d4d-604c431e09e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:28 np0005466013 nova_compute[192144]: 2025-10-02 12:07:28.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.507 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.507 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.508 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.508 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.509 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.526 2 INFO nova.compute.manager [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Terminating instance#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.541 2 DEBUG nova.compute.manager [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:07:29 np0005466013 kernel: tap0b2b5a9b-14 (unregistering): left promiscuous mode
Oct  2 08:07:29 np0005466013 NetworkManager[51205]: <info>  [1759406849.5669] device (tap0b2b5a9b-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:07:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:29Z|00111|binding|INFO|Releasing lport 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 from this chassis (sb_readonly=0)
Oct  2 08:07:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:29Z|00112|binding|INFO|Setting lport 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 down in Southbound
Oct  2 08:07:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:07:29Z|00113|binding|INFO|Removing iface tap0b2b5a9b-14 ovn-installed in OVS
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.595 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:29:fa 10.100.0.13'], port_security=['fa:16:3e:61:29:fa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a850e122-58e1-4fa2-9555-1564c9c36203', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.596 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e unbound from our chassis#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.598 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.599 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[49e1cb6a-9b8f-455e-bd71-997f633309ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.600 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e namespace which is not needed anymore#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct  2 08:07:29 np0005466013 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000025.scope: Consumed 13.763s CPU time.
Oct  2 08:07:29 np0005466013 systemd-machined[152202]: Machine qemu-14-instance-00000025 terminated.
Oct  2 08:07:29 np0005466013 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224248]: [NOTICE]   (224252) : haproxy version is 2.8.14-c23fe91
Oct  2 08:07:29 np0005466013 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224248]: [NOTICE]   (224252) : path to executable is /usr/sbin/haproxy
Oct  2 08:07:29 np0005466013 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224248]: [WARNING]  (224252) : Exiting Master process...
Oct  2 08:07:29 np0005466013 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224248]: [ALERT]    (224252) : Current worker (224254) exited with code 143 (Terminated)
Oct  2 08:07:29 np0005466013 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224248]: [WARNING]  (224252) : All workers exited. Exiting... (0)
Oct  2 08:07:29 np0005466013 systemd[1]: libpod-b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc.scope: Deactivated successfully.
Oct  2 08:07:29 np0005466013 conmon[224248]: conmon b7bfd7638f77aabc0fc6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc.scope/container/memory.events
Oct  2 08:07:29 np0005466013 podman[224880]: 2025-10-02 12:07:29.75718928 +0000 UTC m=+0.054315159 container died b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.778 2 DEBUG nova.compute.manager [req-89d7acff-9ac9-4ac0-bb9d-47e471fb5860 req-428c3355-94fb-4f86-b457-e075e2ad359c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received event network-vif-unplugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.779 2 DEBUG oslo_concurrency.lockutils [req-89d7acff-9ac9-4ac0-bb9d-47e471fb5860 req-428c3355-94fb-4f86-b457-e075e2ad359c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.779 2 DEBUG oslo_concurrency.lockutils [req-89d7acff-9ac9-4ac0-bb9d-47e471fb5860 req-428c3355-94fb-4f86-b457-e075e2ad359c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.784 2 DEBUG oslo_concurrency.lockutils [req-89d7acff-9ac9-4ac0-bb9d-47e471fb5860 req-428c3355-94fb-4f86-b457-e075e2ad359c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.785 2 DEBUG nova.compute.manager [req-89d7acff-9ac9-4ac0-bb9d-47e471fb5860 req-428c3355-94fb-4f86-b457-e075e2ad359c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] No waiting events found dispatching network-vif-unplugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.785 2 DEBUG nova.compute.manager [req-89d7acff-9ac9-4ac0-bb9d-47e471fb5860 req-428c3355-94fb-4f86-b457-e075e2ad359c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received event network-vif-unplugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:07:29 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc-userdata-shm.mount: Deactivated successfully.
Oct  2 08:07:29 np0005466013 systemd[1]: var-lib-containers-storage-overlay-12d003e85d90663a8a55e3f422c0c1335bfa794efcc275bb0fe0f2abb9865be7-merged.mount: Deactivated successfully.
Oct  2 08:07:29 np0005466013 podman[224880]: 2025-10-02 12:07:29.80232648 +0000 UTC m=+0.099452349 container cleanup b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:07:29 np0005466013 systemd[1]: libpod-conmon-b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc.scope: Deactivated successfully.
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.818 2 INFO nova.virt.libvirt.driver [-] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Instance destroyed successfully.#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.818 2 DEBUG nova.objects.instance [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'resources' on Instance uuid a850e122-58e1-4fa2-9555-1564c9c36203 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.839 2 DEBUG nova.virt.libvirt.vif [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1175198892',display_name='tempest-ServersAdminTestJSON-server-1175198892',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1175198892',id=37,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-wfa8n35h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:06:49Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=a850e122-58e1-4fa2-9555-1564c9c36203,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.840 2 DEBUG nova.network.os_vif_util [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "address": "fa:16:3e:61:29:fa", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b2b5a9b-14", "ovs_interfaceid": "0b2b5a9b-1472-4353-96f8-c4b5d8fe1132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.841 2 DEBUG nova.network.os_vif_util [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:61:29:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b2b5a9b-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.842 2 DEBUG os_vif [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:29:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b2b5a9b-14') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.845 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b2b5a9b-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.853 2 INFO os_vif [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:29:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b2b5a9b-1472-4353-96f8-c4b5d8fe1132,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b2b5a9b-14')#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.854 2 INFO nova.virt.libvirt.driver [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Deleting instance files /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203_del#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.855 2 INFO nova.virt.libvirt.driver [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Deletion of /var/lib/nova/instances/a850e122-58e1-4fa2-9555-1564c9c36203_del complete#033[00m
Oct  2 08:07:29 np0005466013 podman[224924]: 2025-10-02 12:07:29.867866888 +0000 UTC m=+0.040883398 container remove b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.872 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[483a08ce-118c-4f2d-9f7e-eff33b2ee558]: (4, ('Thu Oct  2 12:07:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e (b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc)\nb7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc\nThu Oct  2 12:07:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e (b7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc)\nb7bfd7638f77aabc0fc63f36868ef30733a985d73f3ca817cdb08d2a4193c6cc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.873 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[01893d50-ca62-4f60-a7c8-bc528cfc901f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.875 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 kernel: tap66b5a7c3-f0: left promiscuous mode
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.891 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[458c20e4-8bae-4f11-bfb5-e5d458062570]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.926 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[faf578d2-1b68-40cd-9674-6563bb6ce359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.928 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e91df466-1d8c-4c4f-ace2-92bf14231f65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.929 2 INFO nova.compute.manager [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.929 2 DEBUG oslo.service.loopingcall [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.930 2 DEBUG nova.compute.manager [-] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:07:29 np0005466013 nova_compute[192144]: 2025-10-02 12:07:29.930 2 DEBUG nova.network.neutron [-] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.948 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[eb854365-6e35-4128-9f40-ad74b826f36c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480043, 'reachable_time': 36380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224939, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.951 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:07:29 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:07:29.951 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[92768367-59fe-424e-a6c7-6f8d9f2b23a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:29 np0005466013 systemd[1]: run-netns-ovnmeta\x2d66b5a7c3\x2dfe3e\x2d42b0\x2daea6\x2d19534bca6e0e.mount: Deactivated successfully.
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.621 2 DEBUG nova.network.neutron [-] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.640 2 INFO nova.compute.manager [-] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Took 0.71 seconds to deallocate network for instance.#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.709 2 DEBUG nova.compute.manager [req-a3f398ec-a40c-4f21-a996-2dc0bcedc0c9 req-1c5c8975-2fa7-46a0-aebf-404eb6738611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received event network-vif-deleted-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.728 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.729 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.802 2 DEBUG nova.compute.provider_tree [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.823 2 DEBUG nova.scheduler.client.report [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.842 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.865 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.865 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.877 2 INFO nova.scheduler.client.report [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Deleted allocations for instance a850e122-58e1-4fa2-9555-1564c9c36203#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.887 2 DEBUG nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.996 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:30 np0005466013 nova_compute[192144]: 2025-10-02 12:07:30.996 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.002 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.002 2 INFO nova.compute.claims [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.015 2 DEBUG oslo_concurrency.lockutils [None req-f29d4072-0aba-4e77-a0d8-cde3723b2a0d 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.122 2 DEBUG nova.compute.provider_tree [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.135 2 DEBUG nova.scheduler.client.report [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.156 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.157 2 DEBUG nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.214 2 DEBUG nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.247 2 INFO nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.265 2 DEBUG nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.371 2 DEBUG nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.372 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.373 2 INFO nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Creating image(s)#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.373 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.374 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.375 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.388 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.444 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.445 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.445 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.457 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.508 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.509 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.544 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.545 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.545 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.598 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.599 2 DEBUG nova.virt.disk.api [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Checking if we can resize image /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.599 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.672 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.673 2 DEBUG nova.virt.disk.api [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Cannot resize image /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.673 2 DEBUG nova.objects.instance [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lazy-loading 'migration_context' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.711 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.712 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Ensure instance console log exists: /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.712 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.713 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.713 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.714 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.719 2 WARNING nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.726 2 DEBUG nova.virt.libvirt.host [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.727 2 DEBUG nova.virt.libvirt.host [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.731 2 DEBUG nova.virt.libvirt.host [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.732 2 DEBUG nova.virt.libvirt.host [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.734 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.734 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.735 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.735 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.735 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.735 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.735 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.735 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.736 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.736 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.736 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.736 2 DEBUG nova.virt.hardware [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.739 2 DEBUG nova.objects.instance [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lazy-loading 'pci_devices' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.761 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <uuid>1df89ab6-e68b-4cdb-96ac-80896dce72c0</uuid>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <name>instance-0000002a</name>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1188612115</nova:name>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:07:31</nova:creationTime>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:        <nova:user uuid="70e85655ffe7475ba88961b19bf4d65a">tempest-UnshelveToHostMultiNodesTest-250675149-project-member</nova:user>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:        <nova:project uuid="ef1d6333695d494da23ba067aaed9cfb">tempest-UnshelveToHostMultiNodesTest-250675149</nova:project>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <entry name="serial">1df89ab6-e68b-4cdb-96ac-80896dce72c0</entry>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <entry name="uuid">1df89ab6-e68b-4cdb-96ac-80896dce72c0</entry>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/console.log" append="off"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:07:31 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:07:31 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:07:31 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:07:31 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.819 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.819 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.819 2 INFO nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Using config drive
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.917 2 DEBUG nova.compute.manager [req-8c20686b-4255-414a-9c41-da42d417bdc4 req-f593a6e6-7cc6-4cd1-a7a6-cf1b890a77fb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received event network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.917 2 DEBUG oslo_concurrency.lockutils [req-8c20686b-4255-414a-9c41-da42d417bdc4 req-f593a6e6-7cc6-4cd1-a7a6-cf1b890a77fb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.917 2 DEBUG oslo_concurrency.lockutils [req-8c20686b-4255-414a-9c41-da42d417bdc4 req-f593a6e6-7cc6-4cd1-a7a6-cf1b890a77fb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.918 2 DEBUG oslo_concurrency.lockutils [req-8c20686b-4255-414a-9c41-da42d417bdc4 req-f593a6e6-7cc6-4cd1-a7a6-cf1b890a77fb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a850e122-58e1-4fa2-9555-1564c9c36203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.918 2 DEBUG nova.compute.manager [req-8c20686b-4255-414a-9c41-da42d417bdc4 req-f593a6e6-7cc6-4cd1-a7a6-cf1b890a77fb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] No waiting events found dispatching network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:07:31 np0005466013 nova_compute[192144]: 2025-10-02 12:07:31.918 2 WARNING nova.compute.manager [req-8c20686b-4255-414a-9c41-da42d417bdc4 req-f593a6e6-7cc6-4cd1-a7a6-cf1b890a77fb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Received unexpected event network-vif-plugged-0b2b5a9b-1472-4353-96f8-c4b5d8fe1132 for instance with vm_state deleted and task_state None.
Oct  2 08:07:32 np0005466013 nova_compute[192144]: 2025-10-02 12:07:32.270 2 INFO nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Creating config drive at /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config
Oct  2 08:07:32 np0005466013 nova_compute[192144]: 2025-10-02 12:07:32.274 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0umc0zi7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:07:32 np0005466013 nova_compute[192144]: 2025-10-02 12:07:32.398 2 DEBUG oslo_concurrency.processutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0umc0zi7" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:07:32 np0005466013 systemd-machined[152202]: New machine qemu-17-instance-0000002a.
Oct  2 08:07:32 np0005466013 systemd[1]: Started Virtual Machine qemu-17-instance-0000002a.
Oct  2 08:07:32 np0005466013 nova_compute[192144]: 2025-10-02 12:07:32.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.153 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406853.1532547, 1df89ab6-e68b-4cdb-96ac-80896dce72c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.154 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] VM Resumed (Lifecycle Event)
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.156 2 DEBUG nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.157 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.160 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance spawned successfully.
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.160 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.181 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.186 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.189 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.189 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.190 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.190 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.191 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.191 2 DEBUG nova.virt.libvirt.driver [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.221 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.222 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406853.154143, 1df89ab6-e68b-4cdb-96ac-80896dce72c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.222 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.260 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.263 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.289 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.315 2 INFO nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Took 1.94 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.316 2 DEBUG nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.406 2 INFO nova.compute.manager [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Took 2.45 seconds to build instance.#033[00m
Oct  2 08:07:33 np0005466013 nova_compute[192144]: 2025-10-02 12:07:33.428 2 DEBUG oslo_concurrency.lockutils [None req-0f8896c1-6de4-4229-9c02-95bdf6b1aa5b 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:34 np0005466013 nova_compute[192144]: 2025-10-02 12:07:34.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466013 nova_compute[192144]: 2025-10-02 12:07:35.970 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:35 np0005466013 nova_compute[192144]: 2025-10-02 12:07:35.970 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:35 np0005466013 nova_compute[192144]: 2025-10-02 12:07:35.971 2 INFO nova.compute.manager [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Shelving#033[00m
Oct  2 08:07:36 np0005466013 nova_compute[192144]: 2025-10-02 12:07:36.020 2 DEBUG nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:07:36 np0005466013 podman[224983]: 2025-10-02 12:07:36.689822362 +0000 UTC m=+0.060135726 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:07:37 np0005466013 nova_compute[192144]: 2025-10-02 12:07:37.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:38 np0005466013 nova_compute[192144]: 2025-10-02 12:07:38.209 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406843.2062132, 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:38 np0005466013 nova_compute[192144]: 2025-10-02 12:07:38.209 2 INFO nova.compute.manager [-] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:07:38 np0005466013 nova_compute[192144]: 2025-10-02 12:07:38.229 2 DEBUG nova.compute.manager [None req-733530b2-2133-4b18-9ac6-cd1a24ad991a - - - - - -] [instance: 555db7fb-e4e1-4fa2-9b4c-dbb3a2d0f952] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:39 np0005466013 podman[225005]: 2025-10-02 12:07:39.683567091 +0000 UTC m=+0.058326132 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:07:39 np0005466013 podman[225004]: 2025-10-02 12:07:39.683615163 +0000 UTC m=+0.058944625 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:07:39 np0005466013 nova_compute[192144]: 2025-10-02 12:07:39.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:39 np0005466013 nova_compute[192144]: 2025-10-02 12:07:39.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:42 np0005466013 nova_compute[192144]: 2025-10-02 12:07:42.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:43 np0005466013 podman[225050]: 2025-10-02 12:07:43.67280915 +0000 UTC m=+0.050105759 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:07:43 np0005466013 podman[225052]: 2025-10-02 12:07:43.687708222 +0000 UTC m=+0.059082739 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:07:44 np0005466013 nova_compute[192144]: 2025-10-02 12:07:44.816 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406849.8155441, a850e122-58e1-4fa2-9555-1564c9c36203 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:44 np0005466013 nova_compute[192144]: 2025-10-02 12:07:44.817 2 INFO nova.compute.manager [-] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:07:44 np0005466013 nova_compute[192144]: 2025-10-02 12:07:44.839 2 DEBUG nova.compute.manager [None req-6f3639a4-869e-4344-99f1-fd3fe2c3d532 - - - - - -] [instance: a850e122-58e1-4fa2-9555-1564c9c36203] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:44 np0005466013 nova_compute[192144]: 2025-10-02 12:07:44.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:46 np0005466013 nova_compute[192144]: 2025-10-02 12:07:46.067 2 DEBUG nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:07:47 np0005466013 nova_compute[192144]: 2025-10-02 12:07:47.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:48 np0005466013 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Oct  2 08:07:48 np0005466013 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002a.scope: Consumed 12.487s CPU time.
Oct  2 08:07:48 np0005466013 systemd-machined[152202]: Machine qemu-17-instance-0000002a terminated.
Oct  2 08:07:49 np0005466013 nova_compute[192144]: 2025-10-02 12:07:49.083 2 INFO nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:07:49 np0005466013 nova_compute[192144]: 2025-10-02 12:07:49.087 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance destroyed successfully.#033[00m
Oct  2 08:07:49 np0005466013 nova_compute[192144]: 2025-10-02 12:07:49.088 2 DEBUG nova.objects.instance [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lazy-loading 'numa_topology' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:49 np0005466013 nova_compute[192144]: 2025-10-02 12:07:49.571 2 INFO nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Beginning cold snapshot process#033[00m
Oct  2 08:07:49 np0005466013 nova_compute[192144]: 2025-10-02 12:07:49.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:50 np0005466013 nova_compute[192144]: 2025-10-02 12:07:50.477 2 DEBUG nova.privsep.utils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:07:50 np0005466013 nova_compute[192144]: 2025-10-02 12:07:50.478 2 DEBUG oslo_concurrency.processutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk /var/lib/nova/instances/snapshots/tmp__l_73n5/4d9e94a9e2024d93b5111da57bca8392 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:50 np0005466013 nova_compute[192144]: 2025-10-02 12:07:50.863 2 DEBUG oslo_concurrency.processutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk /var/lib/nova/instances/snapshots/tmp__l_73n5/4d9e94a9e2024d93b5111da57bca8392" returned: 0 in 0.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:50 np0005466013 nova_compute[192144]: 2025-10-02 12:07:50.864 2 INFO nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:07:52 np0005466013 nova_compute[192144]: 2025-10-02 12:07:52.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.067 2 INFO nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Snapshot image upload complete#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.068 2 DEBUG nova.compute.manager [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.156 2 INFO nova.compute.manager [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Shelve offloading#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.174 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance destroyed successfully.#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.175 2 DEBUG nova.compute.manager [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.177 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.177 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquired lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.178 2 DEBUG nova.network.neutron [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.380 2 DEBUG nova.network.neutron [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:07:54 np0005466013 nova_compute[192144]: 2025-10-02 12:07:54.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.261 2 DEBUG nova.network.neutron [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.279 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Releasing lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.285 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance destroyed successfully.#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.286 2 DEBUG nova.objects.instance [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lazy-loading 'resources' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.298 2 INFO nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Deleting instance files /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0_del#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.305 2 INFO nova.virt.libvirt.driver [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Deletion of /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0_del complete#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.420 2 INFO nova.scheduler.client.report [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Deleted allocations for instance 1df89ab6-e68b-4cdb-96ac-80896dce72c0#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.475 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.475 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.503 2 DEBUG nova.compute.provider_tree [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.514 2 DEBUG nova.scheduler.client.report [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.541 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:55 np0005466013 nova_compute[192144]: 2025-10-02 12:07:55.615 2 DEBUG oslo_concurrency.lockutils [None req-670f9717-a6dd-4cad-b131-41f33e23f4cc 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:56 np0005466013 podman[225125]: 2025-10-02 12:07:56.668707094 +0000 UTC m=+0.050054017 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:07:56 np0005466013 podman[225126]: 2025-10-02 12:07:56.70280449 +0000 UTC m=+0.078578384 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 08:07:56 np0005466013 podman[225127]: 2025-10-02 12:07:56.702952466 +0000 UTC m=+0.078427430 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.680 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquiring lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.680 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.681 2 INFO nova.compute.manager [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Unshelving#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.788 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.788 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.792 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'pci_requests' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.805 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'numa_topology' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.815 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.816 2 INFO nova.compute.claims [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.940 2 DEBUG nova.compute.provider_tree [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.958 2 DEBUG nova.scheduler.client.report [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:57 np0005466013 nova_compute[192144]: 2025-10-02 12:07:57.978 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.097 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquiring lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.097 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquired lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.098 2 DEBUG nova.network.neutron [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.312 2 DEBUG nova.network.neutron [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.517 2 DEBUG nova.network.neutron [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.578 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Releasing lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.580 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.581 2 INFO nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Creating image(s)#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.582 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquiring lock "/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.582 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.584 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.584 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.602 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquiring lock "c05f5917d86516c44566bebc89543d31048e148b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:58 np0005466013 nova_compute[192144]: 2025-10-02 12:07:58.604 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "c05f5917d86516c44566bebc89543d31048e148b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.310 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.374 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.376 2 DEBUG nova.virt.images [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] 56278c39-9b26-4671-b11b-d390810aead5 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.377 2 DEBUG nova.privsep.utils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.378 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.part /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.662 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.part /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.converted" returned: 0 in 0.284s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.675 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.738 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b.converted --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.740 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "c05f5917d86516c44566bebc89543d31048e148b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.758 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.835 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.836 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquiring lock "c05f5917d86516c44566bebc89543d31048e148b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.837 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "c05f5917d86516c44566bebc89543d31048e148b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.852 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.924 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.926 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b,backing_fmt=raw /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.966 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b,backing_fmt=raw /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.968 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "c05f5917d86516c44566bebc89543d31048e148b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:08:00 np0005466013 nova_compute[192144]: 2025-10-02 12:08:00.968 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:08:01 np0005466013 nova_compute[192144]: 2025-10-02 12:08:01.029 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:08:01 np0005466013 nova_compute[192144]: 2025-10-02 12:08:01.031 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'migration_context' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:08:01 np0005466013 nova_compute[192144]: 2025-10-02 12:08:01.052 2 INFO nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Rebasing disk image.
Oct  2 08:08:01 np0005466013 nova_compute[192144]: 2025-10-02 12:08:01.053 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:08:01 np0005466013 nova_compute[192144]: 2025-10-02 12:08:01.118 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:08:01 np0005466013 nova_compute[192144]: 2025-10-02 12:08:01.120 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:08:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:08:02.288 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:08:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:08:02.289 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:08:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:08:02.289 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.621 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk" returned: 0 in 1.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.622 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.622 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Ensure instance console log exists: /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.623 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.623 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.623 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.625 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='15e6e160e59c9954ca1333957bb1fa9a',container_format='bare',created_at=2025-10-02T12:07:35Z,direct_url=<?>,disk_format='qcow2',id=56278c39-9b26-4671-b11b-d390810aead5,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1188612115-shelved',owner='ef1d6333695d494da23ba067aaed9cfb',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-10-02T12:07:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.631 2 WARNING nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.639 2 DEBUG nova.virt.libvirt.host [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.640 2 DEBUG nova.virt.libvirt.host [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.645 2 DEBUG nova.virt.libvirt.host [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.646 2 DEBUG nova.virt.libvirt.host [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.647 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.647 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='15e6e160e59c9954ca1333957bb1fa9a',container_format='bare',created_at=2025-10-02T12:07:35Z,direct_url=<?>,disk_format='qcow2',id=56278c39-9b26-4671-b11b-d390810aead5,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1188612115-shelved',owner='ef1d6333695d494da23ba067aaed9cfb',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-10-02T12:07:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.648 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.648 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.648 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.649 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.649 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.649 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.649 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.649 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.650 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.650 2 DEBUG nova.virt.hardware [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.650 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.738 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.791 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <uuid>1df89ab6-e68b-4cdb-96ac-80896dce72c0</uuid>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <name>instance-0000002a</name>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1188612115</nova:name>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:08:02</nova:creationTime>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:        <nova:user uuid="70e85655ffe7475ba88961b19bf4d65a">tempest-UnshelveToHostMultiNodesTest-250675149-project-member</nova:user>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:        <nova:project uuid="ef1d6333695d494da23ba067aaed9cfb">tempest-UnshelveToHostMultiNodesTest-250675149</nova:project>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="56278c39-9b26-4671-b11b-d390810aead5"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <nova:ports/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <entry name="serial">1df89ab6-e68b-4cdb-96ac-80896dce72c0</entry>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <entry name="uuid">1df89ab6-e68b-4cdb-96ac-80896dce72c0</entry>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/console.log" append="off"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <input type="keyboard" bus="usb"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:08:02 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:08:02 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:08:02 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:08:02 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.912 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.913 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.914 2 INFO nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Using config drive
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.947 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:08:02 np0005466013 nova_compute[192144]: 2025-10-02 12:08:02.995 2 DEBUG nova.objects.instance [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lazy-loading 'keypairs' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:08:03 np0005466013 nova_compute[192144]: 2025-10-02 12:08:03.172 2 INFO nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Creating config drive at /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config
Oct  2 08:08:03 np0005466013 nova_compute[192144]: 2025-10-02 12:08:03.179 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ttf49v0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:08:03 np0005466013 nova_compute[192144]: 2025-10-02 12:08:03.309 2 DEBUG oslo_concurrency.processutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ttf49v0" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:08:03 np0005466013 systemd-machined[152202]: New machine qemu-18-instance-0000002a.
Oct  2 08:08:03 np0005466013 systemd[1]: Started Virtual Machine qemu-18-instance-0000002a.
Oct  2 08:08:03 np0005466013 nova_compute[192144]: 2025-10-02 12:08:03.424 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406868.423424, 1df89ab6-e68b-4cdb-96ac-80896dce72c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:08:03 np0005466013 nova_compute[192144]: 2025-10-02 12:08:03.425 2 INFO nova.compute.manager [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] VM Stopped (Lifecycle Event)
Oct  2 08:08:03 np0005466013 nova_compute[192144]: 2025-10-02 12:08:03.445 2 DEBUG nova.compute.manager [None req-ae331894-b769-4f27-add7-a1d580b6b607 - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.451 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406884.4505372, 1df89ab6-e68b-4cdb-96ac-80896dce72c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.451 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] VM Resumed (Lifecycle Event)
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.453 2 DEBUG nova.compute.manager [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.454 2 DEBUG nova.virt.libvirt.driver [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.456 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance spawned successfully.
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.524 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.529 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.737 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.738 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406884.4517357, 1df89ab6-e68b-4cdb-96ac-80896dce72c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.738 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] VM Started (Lifecycle Event)
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.761 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.765 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:08:04 np0005466013 nova_compute[192144]: 2025-10-02 12:08:04.871 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:08:05 np0005466013 nova_compute[192144]: 2025-10-02 12:08:05.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:05 np0005466013 nova_compute[192144]: 2025-10-02 12:08:05.574 2 DEBUG nova.compute.manager [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:05 np0005466013 nova_compute[192144]: 2025-10-02 12:08:05.684 2 DEBUG oslo_concurrency.lockutils [None req-781a89e3-7857-41d3-b82f-b3feb8efa4d7 7c9f5af6d8f24daf9842b195fa11137e 6e923b73e6774f58bca20f0f5d2962bf - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 8.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:07 np0005466013 podman[225254]: 2025-10-02 12:08:07.708774182 +0000 UTC m=+0.073772353 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:08:07 np0005466013 nova_compute[192144]: 2025-10-02 12:08:07.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:08 np0005466013 nova_compute[192144]: 2025-10-02 12:08:08.053 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:08 np0005466013 nova_compute[192144]: 2025-10-02 12:08:08.054 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:08 np0005466013 nova_compute[192144]: 2025-10-02 12:08:08.054 2 INFO nova.compute.manager [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Shelving#033[00m
Oct  2 08:08:08 np0005466013 nova_compute[192144]: 2025-10-02 12:08:08.098 2 DEBUG nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:08:10 np0005466013 nova_compute[192144]: 2025-10-02 12:08:10.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:10 np0005466013 podman[225275]: 2025-10-02 12:08:10.683359827 +0000 UTC m=+0.059472643 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 08:08:10 np0005466013 podman[225276]: 2025-10-02 12:08:10.706306816 +0000 UTC m=+0.081097224 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6)
Oct  2 08:08:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:08:12Z|00114|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:08:12 np0005466013 nova_compute[192144]: 2025-10-02 12:08:12.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:14 np0005466013 podman[225315]: 2025-10-02 12:08:14.667589728 +0000 UTC m=+0.049487877 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:08:14 np0005466013 podman[225316]: 2025-10-02 12:08:14.67659976 +0000 UTC m=+0.054792566 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:08:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:08:14.808 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:08:14.810 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:08:14 np0005466013 nova_compute[192144]: 2025-10-02 12:08:14.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:14 np0005466013 nova_compute[192144]: 2025-10-02 12:08:14.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:14 np0005466013 nova_compute[192144]: 2025-10-02 12:08:14.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.093 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.093 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.203 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.204 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.204 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.204 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.358 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.415 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.416 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.479 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.648 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.651 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5627MB free_disk=73.3677749633789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.652 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:15 np0005466013 nova_compute[192144]: 2025-10-02 12:08:15.653 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:16 np0005466013 nova_compute[192144]: 2025-10-02 12:08:16.248 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 1df89ab6-e68b-4cdb-96ac-80896dce72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:08:16 np0005466013 nova_compute[192144]: 2025-10-02 12:08:16.249 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:08:16 np0005466013 nova_compute[192144]: 2025-10-02 12:08:16.249 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:08:16 np0005466013 nova_compute[192144]: 2025-10-02 12:08:16.640 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:16 np0005466013 nova_compute[192144]: 2025-10-02 12:08:16.724 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:16 np0005466013 nova_compute[192144]: 2025-10-02 12:08:16.821 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:08:16 np0005466013 nova_compute[192144]: 2025-10-02 12:08:16.821 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:17 np0005466013 nova_compute[192144]: 2025-10-02 12:08:17.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.140 2 DEBUG nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.722 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.746 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.746 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.747 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.747 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.747 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.748 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:18 np0005466013 nova_compute[192144]: 2025-10-02 12:08:18.748 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:08:18 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:08:20 np0005466013 nova_compute[192144]: 2025-10-02 12:08:20.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:20 np0005466013 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Oct  2 08:08:20 np0005466013 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002a.scope: Consumed 13.496s CPU time.
Oct  2 08:08:20 np0005466013 systemd-machined[152202]: Machine qemu-18-instance-0000002a terminated.
Oct  2 08:08:21 np0005466013 nova_compute[192144]: 2025-10-02 12:08:21.158 2 INFO nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:08:21 np0005466013 nova_compute[192144]: 2025-10-02 12:08:21.162 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance destroyed successfully.#033[00m
Oct  2 08:08:21 np0005466013 nova_compute[192144]: 2025-10-02 12:08:21.162 2 DEBUG nova.objects.instance [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lazy-loading 'numa_topology' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:21 np0005466013 nova_compute[192144]: 2025-10-02 12:08:21.542 2 INFO nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Beginning cold snapshot process#033[00m
Oct  2 08:08:21 np0005466013 nova_compute[192144]: 2025-10-02 12:08:21.821 2 DEBUG nova.privsep.utils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:08:21 np0005466013 nova_compute[192144]: 2025-10-02 12:08:21.822 2 DEBUG oslo_concurrency.processutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk /var/lib/nova/instances/snapshots/tmptso8mavu/ed0c67505dde453586d3c87a920c6005 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:22 np0005466013 nova_compute[192144]: 2025-10-02 12:08:22.015 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:22 np0005466013 nova_compute[192144]: 2025-10-02 12:08:22.338 2 DEBUG oslo_concurrency.processutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0/disk /var/lib/nova/instances/snapshots/tmptso8mavu/ed0c67505dde453586d3c87a920c6005" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:22 np0005466013 nova_compute[192144]: 2025-10-02 12:08:22.339 2 INFO nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:08:22 np0005466013 nova_compute[192144]: 2025-10-02 12:08:22.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:08:24.813 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.068 2 INFO nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Snapshot image upload complete#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.069 2 DEBUG nova.compute.manager [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.186 2 INFO nova.compute.manager [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Shelve offloading#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.204 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance destroyed successfully.#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.204 2 DEBUG nova.compute.manager [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.207 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.207 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquired lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.207 2 DEBUG nova.network.neutron [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.423 2 DEBUG nova.network.neutron [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.790 2 DEBUG nova.network.neutron [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.815 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Releasing lock "refresh_cache-1df89ab6-e68b-4cdb-96ac-80896dce72c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.823 2 INFO nova.virt.libvirt.driver [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Instance destroyed successfully.#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.824 2 DEBUG nova.objects.instance [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lazy-loading 'resources' on Instance uuid 1df89ab6-e68b-4cdb-96ac-80896dce72c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.844 2 INFO nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Deleting instance files /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0_del#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.849 2 INFO nova.virt.libvirt.driver [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Deletion of /var/lib/nova/instances/1df89ab6-e68b-4cdb-96ac-80896dce72c0_del complete#033[00m
Oct  2 08:08:25 np0005466013 nova_compute[192144]: 2025-10-02 12:08:25.960 2 INFO nova.scheduler.client.report [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Deleted allocations for instance 1df89ab6-e68b-4cdb-96ac-80896dce72c0#033[00m
Oct  2 08:08:26 np0005466013 nova_compute[192144]: 2025-10-02 12:08:26.022 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:26 np0005466013 nova_compute[192144]: 2025-10-02 12:08:26.022 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:26 np0005466013 nova_compute[192144]: 2025-10-02 12:08:26.047 2 DEBUG nova.compute.provider_tree [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:26 np0005466013 nova_compute[192144]: 2025-10-02 12:08:26.066 2 DEBUG nova.scheduler.client.report [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:26 np0005466013 nova_compute[192144]: 2025-10-02 12:08:26.084 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:26 np0005466013 nova_compute[192144]: 2025-10-02 12:08:26.176 2 DEBUG oslo_concurrency.lockutils [None req-e397b7d4-c013-418e-af17-f730cb63a6a1 70e85655ffe7475ba88961b19bf4d65a ef1d6333695d494da23ba067aaed9cfb - - default default] Lock "1df89ab6-e68b-4cdb-96ac-80896dce72c0" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:27 np0005466013 podman[225389]: 2025-10-02 12:08:27.691982372 +0000 UTC m=+0.059352547 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:08:27 np0005466013 podman[225388]: 2025-10-02 12:08:27.693339842 +0000 UTC m=+0.064050008 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:08:27 np0005466013 podman[225390]: 2025-10-02 12:08:27.725388608 +0000 UTC m=+0.090822503 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:08:27 np0005466013 nova_compute[192144]: 2025-10-02 12:08:27.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:30 np0005466013 nova_compute[192144]: 2025-10-02 12:08:30.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466013 nova_compute[192144]: 2025-10-02 12:08:32.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:35 np0005466013 nova_compute[192144]: 2025-10-02 12:08:35.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:35 np0005466013 nova_compute[192144]: 2025-10-02 12:08:35.575 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406900.5746894, 1df89ab6-e68b-4cdb-96ac-80896dce72c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:35 np0005466013 nova_compute[192144]: 2025-10-02 12:08:35.576 2 INFO nova.compute.manager [-] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:08:35 np0005466013 nova_compute[192144]: 2025-10-02 12:08:35.602 2 DEBUG nova.compute.manager [None req-b1d645c3-e8aa-4497-b1d9-1ad58c3a4951 - - - - - -] [instance: 1df89ab6-e68b-4cdb-96ac-80896dce72c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:37 np0005466013 nova_compute[192144]: 2025-10-02 12:08:37.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:38 np0005466013 podman[225453]: 2025-10-02 12:08:38.713862453 +0000 UTC m=+0.089002808 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:08:40 np0005466013 nova_compute[192144]: 2025-10-02 12:08:40.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:41 np0005466013 podman[225473]: 2025-10-02 12:08:41.67874031 +0000 UTC m=+0.055721756 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 08:08:41 np0005466013 podman[225474]: 2025-10-02 12:08:41.684621349 +0000 UTC m=+0.056998715 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  2 08:08:43 np0005466013 nova_compute[192144]: 2025-10-02 12:08:42.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466013 nova_compute[192144]: 2025-10-02 12:08:45.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466013 podman[225515]: 2025-10-02 12:08:45.673155755 +0000 UTC m=+0.054455978 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:08:45 np0005466013 podman[225516]: 2025-10-02 12:08:45.677045873 +0000 UTC m=+0.055299323 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:08:48 np0005466013 nova_compute[192144]: 2025-10-02 12:08:48.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:50 np0005466013 nova_compute[192144]: 2025-10-02 12:08:50.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:53 np0005466013 nova_compute[192144]: 2025-10-02 12:08:53.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:55 np0005466013 nova_compute[192144]: 2025-10-02 12:08:55.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:58 np0005466013 nova_compute[192144]: 2025-10-02 12:08:58.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:58 np0005466013 podman[225559]: 2025-10-02 12:08:58.703893771 +0000 UTC m=+0.080716907 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:08:58 np0005466013 podman[225560]: 2025-10-02 12:08:58.711019247 +0000 UTC m=+0.085211293 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:08:58 np0005466013 podman[225561]: 2025-10-02 12:08:58.718582577 +0000 UTC m=+0.089959487 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:09:00 np0005466013 nova_compute[192144]: 2025-10-02 12:09:00.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:02.289 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:02.290 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:02.290 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:03 np0005466013 nova_compute[192144]: 2025-10-02 12:09:03.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:05 np0005466013 nova_compute[192144]: 2025-10-02 12:09:05.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:08 np0005466013 nova_compute[192144]: 2025-10-02 12:09:08.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:08 np0005466013 nova_compute[192144]: 2025-10-02 12:09:08.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:08 np0005466013 nova_compute[192144]: 2025-10-02 12:09:08.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:09:09 np0005466013 podman[225631]: 2025-10-02 12:09:09.677662087 +0000 UTC m=+0.057878852 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:09:10 np0005466013 nova_compute[192144]: 2025-10-02 12:09:10.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:12 np0005466013 podman[225652]: 2025-10-02 12:09:12.667544704 +0000 UTC m=+0.048724353 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:09:12 np0005466013 podman[225653]: 2025-10-02 12:09:12.667618076 +0000 UTC m=+0.047367571 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, release=1755695350, vcs-type=git)
Oct  2 08:09:13 np0005466013 nova_compute[192144]: 2025-10-02 12:09:13.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.025 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.026 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.045 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.143 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.144 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.151 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.151 2 INFO nova.compute.claims [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.268 2 DEBUG nova.compute.provider_tree [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.286 2 DEBUG nova.scheduler.client.report [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.309 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.310 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.383 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.384 2 DEBUG nova.network.neutron [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.418 2 INFO nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.449 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.607 2 DEBUG nova.policy [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '725180cfb6174d38a53f3965d04a4916', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53cd9990789640a5b5e28b5beb8b222b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.622 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.623 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.624 2 INFO nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Creating image(s)#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.624 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "/var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.625 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "/var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.625 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "/var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.640 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.699 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.700 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.701 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.712 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.779 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.780 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.817 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.818 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.818 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.873 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.874 2 DEBUG nova.virt.disk.api [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Checking if we can resize image /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.874 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.928 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.929 2 DEBUG nova.virt.disk.api [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Cannot resize image /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.929 2 DEBUG nova.objects.instance [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lazy-loading 'migration_context' on Instance uuid aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.946 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.946 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Ensure instance console log exists: /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.947 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.947 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:14 np0005466013 nova_compute[192144]: 2025-10-02 12:09:14.947 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:15 np0005466013 nova_compute[192144]: 2025-10-02 12:09:15.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:15 np0005466013 nova_compute[192144]: 2025-10-02 12:09:15.320 2 DEBUG nova.network.neutron [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Successfully created port: b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:09:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:16.034 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:16.036 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.142 2 DEBUG nova.network.neutron [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Successfully updated port: b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.163 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.164 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquired lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.164 2 DEBUG nova.network.neutron [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.179 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.180 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.180 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.208 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.209 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.251 2 DEBUG nova.compute.manager [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.251 2 DEBUG nova.compute.manager [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing instance network info cache due to event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.251 2 DEBUG oslo_concurrency.lockutils [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.342 2 DEBUG nova.network.neutron [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:09:16.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:09:16 np0005466013 podman[225712]: 2025-10-02 12:09:16.680488544 +0000 UTC m=+0.054267782 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:09:16 np0005466013 podman[225711]: 2025-10-02 12:09:16.705940938 +0000 UTC m=+0.082670846 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:09:16 np0005466013 nova_compute[192144]: 2025-10-02 12:09:16.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.020 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.183 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.184 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5747MB free_disk=73.39663314819336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.185 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.185 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.265 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.265 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.265 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.315 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.333 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.346 2 DEBUG nova.network.neutron [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.362 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.363 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.374 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Releasing lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.375 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Instance network_info: |[{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.375 2 DEBUG oslo_concurrency.lockutils [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.375 2 DEBUG nova.network.neutron [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.378 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Start _get_guest_xml network_info=[{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.381 2 WARNING nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.385 2 DEBUG nova.virt.libvirt.host [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.385 2 DEBUG nova.virt.libvirt.host [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.391 2 DEBUG nova.virt.libvirt.host [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.391 2 DEBUG nova.virt.libvirt.host [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.392 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.392 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.393 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.393 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.393 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.393 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.394 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.394 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.394 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.394 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.394 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.395 2 DEBUG nova.virt.hardware [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.397 2 DEBUG nova.virt.libvirt.vif [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:09:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1306562486',display_name='tempest-FloatingIPsAssociationTestJSON-server-1306562486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1306562486',id=48,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='53cd9990789640a5b5e28b5beb8b222b',ramdisk_id='',reservation_id='r-1hewk00f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-132128146',owner_user_nam
e='tempest-FloatingIPsAssociationTestJSON-132128146-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:09:14Z,user_data=None,user_id='725180cfb6174d38a53f3965d04a4916',uuid=aa3ad2f7-f503-4fb3-b13f-c3691b7f7700,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.398 2 DEBUG nova.network.os_vif_util [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converting VIF {"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.398 2 DEBUG nova.network.os_vif_util [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:44:bd,bridge_name='br-int',has_traffic_filtering=True,id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0274f45-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.399 2 DEBUG nova.objects.instance [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lazy-loading 'pci_devices' on Instance uuid aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.413 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <uuid>aa3ad2f7-f503-4fb3-b13f-c3691b7f7700</uuid>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <name>instance-00000030</name>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1306562486</nova:name>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:09:17</nova:creationTime>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:user uuid="725180cfb6174d38a53f3965d04a4916">tempest-FloatingIPsAssociationTestJSON-132128146-project-member</nova:user>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:project uuid="53cd9990789640a5b5e28b5beb8b222b">tempest-FloatingIPsAssociationTestJSON-132128146</nova:project>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        <nova:port uuid="b0274f45-87f9-46e7-b0b4-c762e6ed2b43">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <entry name="serial">aa3ad2f7-f503-4fb3-b13f-c3691b7f7700</entry>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <entry name="uuid">aa3ad2f7-f503-4fb3-b13f-c3691b7f7700</entry>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk.config"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:1b:44:bd"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <target dev="tapb0274f45-87"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/console.log" append="off"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:09:17 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:09:17 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:09:17 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:09:17 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.414 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Preparing to wait for external event network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.415 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.415 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.415 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.416 2 DEBUG nova.virt.libvirt.vif [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:09:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1306562486',display_name='tempest-FloatingIPsAssociationTestJSON-server-1306562486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1306562486',id=48,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='53cd9990789640a5b5e28b5beb8b222b',ramdisk_id='',reservation_id='r-1hewk00f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-132128146',owne
r_user_name='tempest-FloatingIPsAssociationTestJSON-132128146-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:09:14Z,user_data=None,user_id='725180cfb6174d38a53f3965d04a4916',uuid=aa3ad2f7-f503-4fb3-b13f-c3691b7f7700,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.416 2 DEBUG nova.network.os_vif_util [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converting VIF {"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.417 2 DEBUG nova.network.os_vif_util [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:44:bd,bridge_name='br-int',has_traffic_filtering=True,id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0274f45-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.417 2 DEBUG os_vif [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:44:bd,bridge_name='br-int',has_traffic_filtering=True,id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0274f45-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.418 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.418 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb0274f45-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb0274f45-87, col_values=(('external_ids', {'iface-id': 'b0274f45-87f9-46e7-b0b4-c762e6ed2b43', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:44:bd', 'vm-uuid': 'aa3ad2f7-f503-4fb3-b13f-c3691b7f7700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:17 np0005466013 NetworkManager[51205]: <info>  [1759406957.4236] manager: (tapb0274f45-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.432 2 INFO os_vif [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:44:bd,bridge_name='br-int',has_traffic_filtering=True,id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0274f45-87')#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.483 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.483 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.484 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] No VIF found with MAC fa:16:3e:1b:44:bd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.484 2 INFO nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Using config drive#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.851 2 INFO nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Creating config drive at /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk.config#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.855 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppz6idk8u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:17 np0005466013 nova_compute[192144]: 2025-10-02 12:09:17.980 2 DEBUG oslo_concurrency.processutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppz6idk8u" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 kernel: tapb0274f45-87: entered promiscuous mode
Oct  2 08:09:18 np0005466013 NetworkManager[51205]: <info>  [1759406958.0352] manager: (tapb0274f45-87): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:18Z|00115|binding|INFO|Claiming lport b0274f45-87f9-46e7-b0b4-c762e6ed2b43 for this chassis.
Oct  2 08:09:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:18Z|00116|binding|INFO|b0274f45-87f9-46e7-b0b4-c762e6ed2b43: Claiming fa:16:3e:1b:44:bd 10.100.0.9
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.052 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:44:bd 10.100.0.9'], port_security=['fa:16:3e:1b:44:bd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'aa3ad2f7-f503-4fb3-b13f-c3691b7f7700', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53cd9990789640a5b5e28b5beb8b222b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '451fc8d0-64dd-41c6-91ef-df444df65e30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9845a444-54df-440f-9a26-b835473c9d1b, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=b0274f45-87f9-46e7-b0b4-c762e6ed2b43) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.053 103323 INFO neutron.agent.ovn.metadata.agent [-] Port b0274f45-87f9-46e7-b0b4-c762e6ed2b43 in datapath e3531c03-dcc1-4c2a-981f-8534850ce14f bound to our chassis#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.055 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3531c03-dcc1-4c2a-981f-8534850ce14f#033[00m
Oct  2 08:09:18 np0005466013 systemd-udevd[225773]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:18 np0005466013 systemd-machined[152202]: New machine qemu-19-instance-00000030.
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.066 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[88d6ef8e-f57a-47d1-921d-8eabab72104c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.067 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3531c03-d1 in ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.069 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3531c03-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.069 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8746392d-0e17-438e-b7b3-4c255aed8bcb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.070 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[53f2b017-3173-4eb2-b044-048ff75da711]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 NetworkManager[51205]: <info>  [1759406958.0765] device (tapb0274f45-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:18 np0005466013 NetworkManager[51205]: <info>  [1759406958.0776] device (tapb0274f45-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.080 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[02ed0ae0-9bd7-4557-9097-a220ce080baa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 systemd[1]: Started Virtual Machine qemu-19-instance-00000030.
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.097 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5ee434-b69f-4805-a23c-17632619a134]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:18Z|00117|binding|INFO|Setting lport b0274f45-87f9-46e7-b0b4-c762e6ed2b43 ovn-installed in OVS
Oct  2 08:09:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:18Z|00118|binding|INFO|Setting lport b0274f45-87f9-46e7-b0b4-c762e6ed2b43 up in Southbound
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.121 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ae4f65-961b-487a-9b71-af1289bc0f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.127 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a2ec7a-cbfa-41c6-a2cb-4cc419631755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 NetworkManager[51205]: <info>  [1759406958.1281] manager: (tape3531c03-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Oct  2 08:09:18 np0005466013 systemd-udevd[225776]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.154 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f7be41aa-a99c-4024-a1e5-1adab377d9f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.156 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ef19b29a-1cf1-42f7-b7f2-a2492076268c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 NetworkManager[51205]: <info>  [1759406958.1753] device (tape3531c03-d0): carrier: link connected
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.180 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0abefcd5-7095-44d1-9aea-b729e49fcf23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.197 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[96e02e80-66c3-4090-91e7-82d9019e9f78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3531c03-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:c6:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494979, 'reachable_time': 27575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225805, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.212 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[53297c30-1979-448f-893f-357abbcad7cc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:c6bb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494979, 'tstamp': 494979}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225806, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.229 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f3cd81-988d-40ef-b16c-970ea46fc2c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3531c03-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:c6:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494979, 'reachable_time': 27575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225807, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.259 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c55c6e98-e83e-4162-80e8-28bda91dd34d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.309 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4d101090-a933-4f05-bda0-4663fd5b7ed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.311 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3531c03-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.311 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.311 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3531c03-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 NetworkManager[51205]: <info>  [1759406958.3133] manager: (tape3531c03-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Oct  2 08:09:18 np0005466013 kernel: tape3531c03-d0: entered promiscuous mode
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.316 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3531c03-d0, col_values=(('external_ids', {'iface-id': 'a22200c1-7efb-4203-8e94-7851356dbd00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:18Z|00119|binding|INFO|Releasing lport a22200c1-7efb-4203-8e94-7851356dbd00 from this chassis (sb_readonly=0)
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.320 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3531c03-dcc1-4c2a-981f-8534850ce14f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3531c03-dcc1-4c2a-981f-8534850ce14f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.329 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[044ccfb1-32b2-4272-97a6-92c927000fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.330 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-e3531c03-dcc1-4c2a-981f-8534850ce14f
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/e3531c03-dcc1-4c2a-981f-8534850ce14f.pid.haproxy
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID e3531c03-dcc1-4c2a-981f-8534850ce14f
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:09:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:18.331 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'env', 'PROCESS_TAG=haproxy-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3531c03-dcc1-4c2a-981f-8534850ce14f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.363 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.374 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.375 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.375 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.375 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:09:18 np0005466013 podman[225839]: 2025-10-02 12:09:18.667341081 +0000 UTC m=+0.040704550 container create 9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:09:18 np0005466013 systemd[1]: Started libpod-conmon-9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67.scope.
Oct  2 08:09:18 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:09:18 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec81b60a0db651e7f87ed9e34de9c924c40875d7861d4466bc954345c84240a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:09:18 np0005466013 podman[225839]: 2025-10-02 12:09:18.737607727 +0000 UTC m=+0.110971216 container init 9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.740 2 DEBUG nova.compute.manager [req-c60986b1-2c37-4828-8f49-aff50f8e391c req-10f99b2d-08c1-4eb9-ba5a-256f9f592736 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.741 2 DEBUG oslo_concurrency.lockutils [req-c60986b1-2c37-4828-8f49-aff50f8e391c req-10f99b2d-08c1-4eb9-ba5a-256f9f592736 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.741 2 DEBUG oslo_concurrency.lockutils [req-c60986b1-2c37-4828-8f49-aff50f8e391c req-10f99b2d-08c1-4eb9-ba5a-256f9f592736 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.742 2 DEBUG oslo_concurrency.lockutils [req-c60986b1-2c37-4828-8f49-aff50f8e391c req-10f99b2d-08c1-4eb9-ba5a-256f9f592736 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:18 np0005466013 podman[225839]: 2025-10-02 12:09:18.742646131 +0000 UTC m=+0.116009590 container start 9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:09:18 np0005466013 podman[225839]: 2025-10-02 12:09:18.646752824 +0000 UTC m=+0.020116323 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.742 2 DEBUG nova.compute.manager [req-c60986b1-2c37-4828-8f49-aff50f8e391c req-10f99b2d-08c1-4eb9-ba5a-256f9f592736 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Processing event network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:09:18 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [NOTICE]   (225859) : New worker (225861) forked
Oct  2 08:09:18 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [NOTICE]   (225859) : Loading success.
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.822 2 DEBUG nova.network.neutron [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updated VIF entry in instance network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.823 2 DEBUG nova.network.neutron [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.837 2 DEBUG oslo_concurrency.lockutils [req-55132649-a2ff-4a62-8498-aa19a5cad7c6 req-24e9894e-5587-4f69-b765-c9fff0e0603a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:18 np0005466013 nova_compute[192144]: 2025-10-02 12:09:18.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.327 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.328 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406959.3281615, aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.328 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] VM Started (Lifecycle Event)#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.331 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.334 2 INFO nova.virt.libvirt.driver [-] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Instance spawned successfully.#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.335 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.349 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.354 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.357 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.357 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.357 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.358 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.358 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.359 2 DEBUG nova.virt.libvirt.driver [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.383 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.384 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406959.3291688, aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.384 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.419 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.422 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406959.3315496, aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.423 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.442 2 INFO nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Took 4.82 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.443 2 DEBUG nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.444 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.449 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.484 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.536 2 INFO nova.compute.manager [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Took 5.43 seconds to build instance.#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.562 2 DEBUG oslo_concurrency.lockutils [None req-49de0a0b-d488-433a-bd43-afa58dac68f1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:19 np0005466013 nova_compute[192144]: 2025-10-02 12:09:19.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.014 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.015 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.925 2 DEBUG nova.compute.manager [req-557c209c-cd20-4c5c-bc8e-3caf2cb1abdf req-8fa809bd-b8c5-4cc4-b02f-4e1e9eb5516d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.927 2 DEBUG oslo_concurrency.lockutils [req-557c209c-cd20-4c5c-bc8e-3caf2cb1abdf req-8fa809bd-b8c5-4cc4-b02f-4e1e9eb5516d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.927 2 DEBUG oslo_concurrency.lockutils [req-557c209c-cd20-4c5c-bc8e-3caf2cb1abdf req-8fa809bd-b8c5-4cc4-b02f-4e1e9eb5516d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.927 2 DEBUG oslo_concurrency.lockutils [req-557c209c-cd20-4c5c-bc8e-3caf2cb1abdf req-8fa809bd-b8c5-4cc4-b02f-4e1e9eb5516d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.928 2 DEBUG nova.compute.manager [req-557c209c-cd20-4c5c-bc8e-3caf2cb1abdf req-8fa809bd-b8c5-4cc4-b02f-4e1e9eb5516d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] No waiting events found dispatching network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:20 np0005466013 nova_compute[192144]: 2025-10-02 12:09:20.928 2 WARNING nova.compute.manager [req-557c209c-cd20-4c5c-bc8e-3caf2cb1abdf req-8fa809bd-b8c5-4cc4-b02f-4e1e9eb5516d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received unexpected event network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:22.038 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:22 np0005466013 nova_compute[192144]: 2025-10-02 12:09:22.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:23 np0005466013 nova_compute[192144]: 2025-10-02 12:09:23.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:24 np0005466013 nova_compute[192144]: 2025-10-02 12:09:24.023 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:27 np0005466013 nova_compute[192144]: 2025-10-02 12:09:27.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:28 np0005466013 nova_compute[192144]: 2025-10-02 12:09:28.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:29 np0005466013 podman[225885]: 2025-10-02 12:09:29.674949557 +0000 UTC m=+0.048994772 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:09:29 np0005466013 podman[225884]: 2025-10-02 12:09:29.676309489 +0000 UTC m=+0.051579911 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:09:29 np0005466013 podman[225886]: 2025-10-02 12:09:29.703082483 +0000 UTC m=+0.075372224 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:09:30 np0005466013 nova_compute[192144]: 2025-10-02 12:09:30.792 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:30 np0005466013 nova_compute[192144]: 2025-10-02 12:09:30.793 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:30 np0005466013 nova_compute[192144]: 2025-10-02 12:09:30.817 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:09:30 np0005466013 nova_compute[192144]: 2025-10-02 12:09:30.936 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:30 np0005466013 nova_compute[192144]: 2025-10-02 12:09:30.937 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:30 np0005466013 nova_compute[192144]: 2025-10-02 12:09:30.943 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:09:30 np0005466013 nova_compute[192144]: 2025-10-02 12:09:30.943 2 INFO nova.compute.claims [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.098 2 DEBUG nova.compute.provider_tree [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.113 2 DEBUG nova.scheduler.client.report [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.144 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.145 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.211 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.212 2 DEBUG nova.network.neutron [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.234 2 INFO nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.254 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.399 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.400 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.401 2 INFO nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Creating image(s)#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.401 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "/var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.402 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "/var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.403 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "/var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.417 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.481 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.482 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.482 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.509 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.580 2 DEBUG nova.policy [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '45712a323f9248c3b83534f5afa82f60', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bddb0cc777354c619155396b3af4a779', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.590 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.592 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.636 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.638 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.639 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.697 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.698 2 DEBUG nova.virt.disk.api [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Checking if we can resize image /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.699 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.758 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.759 2 DEBUG nova.virt.disk.api [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Cannot resize image /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.760 2 DEBUG nova.objects.instance [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lazy-loading 'migration_context' on Instance uuid 5498fc15-956a-42a0-817e-e4bb31469607 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.784 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.785 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Ensure instance console log exists: /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.785 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.786 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:31 np0005466013 nova_compute[192144]: 2025-10-02 12:09:31.786 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:31Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1b:44:bd 10.100.0.9
Oct  2 08:09:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:31Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:44:bd 10.100.0.9
Oct  2 08:09:32 np0005466013 nova_compute[192144]: 2025-10-02 12:09:32.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:33 np0005466013 nova_compute[192144]: 2025-10-02 12:09:33.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:33 np0005466013 nova_compute[192144]: 2025-10-02 12:09:33.640 2 DEBUG nova.network.neutron [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Successfully created port: ec65b119-dfb7-4027-bd99-1b3de8416626 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:09:34 np0005466013 nova_compute[192144]: 2025-10-02 12:09:34.988 2 DEBUG nova.network.neutron [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Successfully updated port: ec65b119-dfb7-4027-bd99-1b3de8416626 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.007 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.007 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquired lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.007 2 DEBUG nova.network.neutron [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.203 2 DEBUG nova.network.neutron [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.4992] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/63)
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.5001] device (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.5011] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/64)
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.5013] device (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.5021] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.5026] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.5030] device (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 08:09:35 np0005466013 NetworkManager[51205]: <info>  [1759406975.5033] device (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:35 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:35Z|00120|binding|INFO|Releasing lport a22200c1-7efb-4203-8e94-7851356dbd00 from this chassis (sb_readonly=0)
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.652 2 DEBUG nova.compute.manager [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.653 2 DEBUG nova.compute.manager [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing instance network info cache due to event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.653 2 DEBUG oslo_concurrency.lockutils [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.957 2 DEBUG nova.compute.manager [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.958 2 DEBUG nova.compute.manager [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing instance network info cache due to event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.958 2 DEBUG oslo_concurrency.lockutils [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.958 2 DEBUG oslo_concurrency.lockutils [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:35 np0005466013 nova_compute[192144]: 2025-10-02 12:09:35.959 2 DEBUG nova.network.neutron [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.799 2 DEBUG nova.network.neutron [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.819 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Releasing lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.820 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Instance network_info: |[{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.820 2 DEBUG oslo_concurrency.lockutils [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.820 2 DEBUG nova.network.neutron [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.823 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Start _get_guest_xml network_info=[{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.827 2 WARNING nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.831 2 DEBUG nova.virt.libvirt.host [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.833 2 DEBUG nova.virt.libvirt.host [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.836 2 DEBUG nova.virt.libvirt.host [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.837 2 DEBUG nova.virt.libvirt.host [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.839 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.839 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.840 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.840 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.840 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.841 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.841 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.841 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.841 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.842 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.842 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.842 2 DEBUG nova.virt.hardware [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.846 2 DEBUG nova.virt.libvirt.vif [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:09:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-317293098',display_name='tempest-AttachInterfacesUnderV243Test-server-317293098',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-317293098',id=50,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIZvnjWPkZxqSPdIDG0zWedfhIItto24yt8Qc/6lQ3UzQDrUOVIiy5BZb2l3Cs/s/tvrOYJJdssgLv7yaV7Niy0i4bWu2vTMzJa7gPQs7zbNxPodTDIjKAdRcs6dhQdYQA==',key_name='tempest-keypair-712590735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bddb0cc777354c619155396b3af4a779',ramdisk_id='',reservation_id='r-4voredoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1757342469',owner_user_name='tempest-AttachInterfacesUnderV243Test-1757342469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:09:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='45712a323f9248c3b83534f5afa82f60',uuid=5498fc15-956a-42a0-817e-e4bb31469607,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.847 2 DEBUG nova.network.os_vif_util [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Converting VIF {"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.848 2 DEBUG nova.network.os_vif_util [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:0d:d4,bridge_name='br-int',has_traffic_filtering=True,id=ec65b119-dfb7-4027-bd99-1b3de8416626,network=Network(9e24c283-8cbc-4703-9e61-98b782609dec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec65b119-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.849 2 DEBUG nova.objects.instance [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5498fc15-956a-42a0-817e-e4bb31469607 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.881 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <uuid>5498fc15-956a-42a0-817e-e4bb31469607</uuid>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <name>instance-00000032</name>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <nova:name>tempest-AttachInterfacesUnderV243Test-server-317293098</nova:name>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:09:36</nova:creationTime>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:user uuid="45712a323f9248c3b83534f5afa82f60">tempest-AttachInterfacesUnderV243Test-1757342469-project-member</nova:user>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:project uuid="bddb0cc777354c619155396b3af4a779">tempest-AttachInterfacesUnderV243Test-1757342469</nova:project>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        <nova:port uuid="ec65b119-dfb7-4027-bd99-1b3de8416626">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <entry name="serial">5498fc15-956a-42a0-817e-e4bb31469607</entry>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <entry name="uuid">5498fc15-956a-42a0-817e-e4bb31469607</entry>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk.config"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:ac:0d:d4"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <target dev="tapec65b119-df"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/console.log" append="off"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:09:36 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:09:36 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:09:36 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:09:36 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.881 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Preparing to wait for external event network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.882 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.882 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.883 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.883 2 DEBUG nova.virt.libvirt.vif [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:09:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-317293098',display_name='tempest-AttachInterfacesUnderV243Test-server-317293098',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-317293098',id=50,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIZvnjWPkZxqSPdIDG0zWedfhIItto24yt8Qc/6lQ3UzQDrUOVIiy5BZb2l3Cs/s/tvrOYJJdssgLv7yaV7Niy0i4bWu2vTMzJa7gPQs7zbNxPodTDIjKAdRcs6dhQdYQA==',key_name='tempest-keypair-712590735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bddb0cc777354c619155396b3af4a779',ramdisk_id='',reservation_id='r-4voredoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1757342469',owner_user_name='tempest-AttachInterfacesUnderV243Test-1757342469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:09:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='45712a323f9248c3b83534f5afa82f60',uuid=5498fc15-956a-42a0-817e-e4bb31469607,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.884 2 DEBUG nova.network.os_vif_util [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Converting VIF {"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.884 2 DEBUG nova.network.os_vif_util [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:0d:d4,bridge_name='br-int',has_traffic_filtering=True,id=ec65b119-dfb7-4027-bd99-1b3de8416626,network=Network(9e24c283-8cbc-4703-9e61-98b782609dec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec65b119-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.884 2 DEBUG os_vif [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:0d:d4,bridge_name='br-int',has_traffic_filtering=True,id=ec65b119-dfb7-4027-bd99-1b3de8416626,network=Network(9e24c283-8cbc-4703-9e61-98b782609dec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec65b119-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.885 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.886 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.889 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec65b119-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.890 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec65b119-df, col_values=(('external_ids', {'iface-id': 'ec65b119-dfb7-4027-bd99-1b3de8416626', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:0d:d4', 'vm-uuid': '5498fc15-956a-42a0-817e-e4bb31469607'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:36 np0005466013 NetworkManager[51205]: <info>  [1759406976.8923] manager: (tapec65b119-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.898 2 INFO os_vif [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:0d:d4,bridge_name='br-int',has_traffic_filtering=True,id=ec65b119-dfb7-4027-bd99-1b3de8416626,network=Network(9e24c283-8cbc-4703-9e61-98b782609dec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec65b119-df')#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.953 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.953 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.953 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] No VIF found with MAC fa:16:3e:ac:0d:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:36 np0005466013 nova_compute[192144]: 2025-10-02 12:09:36.954 2 INFO nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Using config drive#033[00m
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.374 2 INFO nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Creating config drive at /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk.config#033[00m
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.379 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmlz64240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.505 2 DEBUG oslo_concurrency.processutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmlz64240" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:37 np0005466013 kernel: tapec65b119-df: entered promiscuous mode
Oct  2 08:09:37 np0005466013 NetworkManager[51205]: <info>  [1759406977.5592] manager: (tapec65b119-df): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Oct  2 08:09:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:37Z|00121|binding|INFO|Claiming lport ec65b119-dfb7-4027-bd99-1b3de8416626 for this chassis.
Oct  2 08:09:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:37Z|00122|binding|INFO|ec65b119-dfb7-4027-bd99-1b3de8416626: Claiming fa:16:3e:ac:0d:d4 10.100.0.12
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:37Z|00123|binding|INFO|Setting lport ec65b119-dfb7-4027-bd99-1b3de8416626 ovn-installed in OVS
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:37Z|00124|binding|INFO|Setting lport ec65b119-dfb7-4027-bd99-1b3de8416626 up in Southbound
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.580 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:0d:d4 10.100.0.12'], port_security=['fa:16:3e:ac:0d:d4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5498fc15-956a-42a0-817e-e4bb31469607', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e24c283-8cbc-4703-9e61-98b782609dec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bddb0cc777354c619155396b3af4a779', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5b39b52-31e4-466c-a577-6a5a4ac5a248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c15321d-9c09-4ce1-a81b-3f0dd80999b7, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ec65b119-dfb7-4027-bd99-1b3de8416626) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.583 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ec65b119-dfb7-4027-bd99-1b3de8416626 in datapath 9e24c283-8cbc-4703-9e61-98b782609dec bound to our chassis#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.585 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9e24c283-8cbc-4703-9e61-98b782609dec#033[00m
Oct  2 08:09:37 np0005466013 systemd-udevd[225996]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.598 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2b0c87d1-275f-49db-bc93-9debca1b107b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.599 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9e24c283-81 in ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.602 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9e24c283-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.602 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[616dd7ed-35fc-466e-b32e-59070898c91d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.603 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6546d4ea-b0f2-418c-9057-dbce33392250]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 systemd-machined[152202]: New machine qemu-20-instance-00000032.
Oct  2 08:09:37 np0005466013 NetworkManager[51205]: <info>  [1759406977.6169] device (tapec65b119-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:37 np0005466013 NetworkManager[51205]: <info>  [1759406977.6183] device (tapec65b119-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.621 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[2f19cedf-89c6-40c1-95ca-022efa830324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 systemd[1]: Started Virtual Machine qemu-20-instance-00000032.
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.642 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb3cbe1-cf46-4089-ab8a-babb3baef1fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.676 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[037c07bf-685f-4e1e-af34-33d5fcece56b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 NetworkManager[51205]: <info>  [1759406977.6850] manager: (tap9e24c283-80): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.684 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1c3337-2432-4647-aa91-9b47718f6ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.721 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[79e8f906-c3ea-45c5-b8e6-65aa968e7405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.725 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[31d16166-d3de-42eb-a3a5-b29929076fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 NetworkManager[51205]: <info>  [1759406977.7514] device (tap9e24c283-80): carrier: link connected
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.758 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[92b1566e-b84c-4f3f-9ad7-0b4568f667b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.778 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[093f16e7-ea29-4c89-80e8-c3fd148b82db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e24c283-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:e4:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496936, 'reachable_time': 23963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226029, 'error': None, 'target': 'ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.793 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[064a74c5-617f-4603-a857-bdd44f8e4789]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7d:e409'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496936, 'tstamp': 496936}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226030, 'error': None, 'target': 'ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.813 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9a20be3d-11dc-4435-aaca-be5ae95ca2d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e24c283-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:e4:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496936, 'reachable_time': 23963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226031, 'error': None, 'target': 'ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.858 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d6683cde-9e37-45b0-8255-587fe40e574c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.930 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b4e7ac-93b3-4b5f-bcbf-d548fb9a346a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.932 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e24c283-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.933 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.933 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e24c283-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:37 np0005466013 kernel: tap9e24c283-80: entered promiscuous mode
Oct  2 08:09:37 np0005466013 NetworkManager[51205]: <info>  [1759406977.9382] manager: (tap9e24c283-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.940 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9e24c283-80, col_values=(('external_ids', {'iface-id': 'c73cfa59-a671-4019-9fd6-1dcbc8e59b26'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:37Z|00125|binding|INFO|Releasing lport c73cfa59-a671-4019-9fd6-1dcbc8e59b26 from this chassis (sb_readonly=0)
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:37 np0005466013 nova_compute[192144]: 2025-10-02 12:09:37.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.958 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9e24c283-8cbc-4703-9e61-98b782609dec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9e24c283-8cbc-4703-9e61-98b782609dec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.959 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0be0e9-94ae-460f-98aa-fb1f3d3ced5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.960 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-9e24c283-8cbc-4703-9e61-98b782609dec
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/9e24c283-8cbc-4703-9e61-98b782609dec.pid.haproxy
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 9e24c283-8cbc-4703-9e61-98b782609dec
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:09:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:09:37.961 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec', 'env', 'PROCESS_TAG=haproxy-9e24c283-8cbc-4703-9e61-98b782609dec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9e24c283-8cbc-4703-9e61-98b782609dec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:38 np0005466013 podman[226069]: 2025-10-02 12:09:38.432143532 +0000 UTC m=+0.062916068 container create bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:09:38 np0005466013 systemd[1]: Started libpod-conmon-bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a.scope.
Oct  2 08:09:38 np0005466013 podman[226069]: 2025-10-02 12:09:38.39557936 +0000 UTC m=+0.026351916 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:09:38 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:09:38 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2482d772718bb72c5c06fa41f5977af25f28154e5a91dc798b04b8b9f4cb1790/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:09:38 np0005466013 podman[226069]: 2025-10-02 12:09:38.552799525 +0000 UTC m=+0.183572081 container init bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:09:38 np0005466013 podman[226069]: 2025-10-02 12:09:38.561436268 +0000 UTC m=+0.192208794 container start bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:09:38 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [NOTICE]   (226090) : New worker (226092) forked
Oct  2 08:09:38 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [NOTICE]   (226090) : Loading success.
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.791 2 DEBUG nova.network.neutron [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updated VIF entry in instance network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.792 2 DEBUG nova.network.neutron [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.832 2 DEBUG oslo_concurrency.lockutils [req-658435e1-de6f-4cfd-b1da-fe145992000e req-8373f3c9-5bef-4684-b8c4-b8e058556f5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.855 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406978.8547747, 5498fc15-956a-42a0-817e-e4bb31469607 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.856 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] VM Started (Lifecycle Event)#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.898 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.904 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406978.8563657, 5498fc15-956a-42a0-817e-e4bb31469607 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.904 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.921 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.926 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:38 np0005466013 nova_compute[192144]: 2025-10-02 12:09:38.943 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.014 2 DEBUG nova.network.neutron [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updated VIF entry in instance network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.015 2 DEBUG nova.network.neutron [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.032 2 DEBUG oslo_concurrency.lockutils [req-d30baf00-2538-4c41-9dc0-5f42c662b7c0 req-da79b344-8f3c-4970-a576-3c1c131e36fe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.884 2 DEBUG nova.compute.manager [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.885 2 DEBUG oslo_concurrency.lockutils [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.885 2 DEBUG oslo_concurrency.lockutils [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.885 2 DEBUG oslo_concurrency.lockutils [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.886 2 DEBUG nova.compute.manager [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Processing event network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.886 2 DEBUG nova.compute.manager [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.886 2 DEBUG oslo_concurrency.lockutils [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.886 2 DEBUG oslo_concurrency.lockutils [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.887 2 DEBUG oslo_concurrency.lockutils [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.887 2 DEBUG nova.compute.manager [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] No waiting events found dispatching network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.887 2 WARNING nova.compute.manager [req-d6dd55e5-d1cf-4830-b21b-fba589dc4220 req-986b9ef5-baee-4ce6-b8d6-21bc05b1cbc6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received unexpected event network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.888 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.892 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759406979.8923576, 5498fc15-956a-42a0-817e-e4bb31469607 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.893 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.895 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.899 2 INFO nova.virt.libvirt.driver [-] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Instance spawned successfully.#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.900 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.918 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.925 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.933 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.934 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.934 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.935 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.936 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.936 2 DEBUG nova.virt.libvirt.driver [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:39 np0005466013 nova_compute[192144]: 2025-10-02 12:09:39.969 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.018 2 INFO nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Took 8.62 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.019 2 DEBUG nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.126 2 INFO nova.compute.manager [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Took 9.23 seconds to build instance.#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.151 2 DEBUG oslo_concurrency.lockutils [None req-ebd7df95-cebb-48fd-b0ea-a07978f91fbf 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.418 2 DEBUG nova.compute.manager [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.419 2 DEBUG nova.compute.manager [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing instance network info cache due to event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.420 2 DEBUG oslo_concurrency.lockutils [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.420 2 DEBUG oslo_concurrency.lockutils [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:40 np0005466013 nova_compute[192144]: 2025-10-02 12:09:40.420 2 DEBUG nova.network.neutron [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:40 np0005466013 podman[226101]: 2025-10-02 12:09:40.699635127 +0000 UTC m=+0.065937154 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3)
Oct  2 08:09:41 np0005466013 nova_compute[192144]: 2025-10-02 12:09:41.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:43 np0005466013 nova_compute[192144]: 2025-10-02 12:09:43.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:43 np0005466013 nova_compute[192144]: 2025-10-02 12:09:43.055 2 DEBUG nova.network.neutron [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updated VIF entry in instance network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:43 np0005466013 nova_compute[192144]: 2025-10-02 12:09:43.056 2 DEBUG nova.network.neutron [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:43 np0005466013 nova_compute[192144]: 2025-10-02 12:09:43.077 2 DEBUG oslo_concurrency.lockutils [req-7ff75422-ca5f-49a0-9e18-3a414d849177 req-20967e78-c2d3-4e66-87c3-f4fd788850e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:43Z|00126|binding|INFO|Releasing lport c73cfa59-a671-4019-9fd6-1dcbc8e59b26 from this chassis (sb_readonly=0)
Oct  2 08:09:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:43Z|00127|binding|INFO|Releasing lport a22200c1-7efb-4203-8e94-7851356dbd00 from this chassis (sb_readonly=0)
Oct  2 08:09:43 np0005466013 nova_compute[192144]: 2025-10-02 12:09:43.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:43 np0005466013 podman[226123]: 2025-10-02 12:09:43.721862468 +0000 UTC m=+0.073495910 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Oct  2 08:09:43 np0005466013 podman[226122]: 2025-10-02 12:09:43.722807952 +0000 UTC m=+0.076727159 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:09:44 np0005466013 nova_compute[192144]: 2025-10-02 12:09:44.433 2 DEBUG nova.compute.manager [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:44 np0005466013 nova_compute[192144]: 2025-10-02 12:09:44.433 2 DEBUG nova.compute.manager [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing instance network info cache due to event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:44 np0005466013 nova_compute[192144]: 2025-10-02 12:09:44.433 2 DEBUG oslo_concurrency.lockutils [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:44 np0005466013 nova_compute[192144]: 2025-10-02 12:09:44.434 2 DEBUG oslo_concurrency.lockutils [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:44 np0005466013 nova_compute[192144]: 2025-10-02 12:09:44.434 2 DEBUG nova.network.neutron [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:46 np0005466013 nova_compute[192144]: 2025-10-02 12:09:46.598 2 DEBUG nova.network.neutron [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updated VIF entry in instance network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:46 np0005466013 nova_compute[192144]: 2025-10-02 12:09:46.599 2 DEBUG nova.network.neutron [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:46 np0005466013 nova_compute[192144]: 2025-10-02 12:09:46.619 2 DEBUG oslo_concurrency.lockutils [req-96e48bf6-4785-4888-96a0-27125a09b60b req-487d022d-ceb3-4aaf-b921-956a456ea62b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:46 np0005466013 nova_compute[192144]: 2025-10-02 12:09:46.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:47 np0005466013 podman[226164]: 2025-10-02 12:09:47.682756158 +0000 UTC m=+0.053995670 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:09:47 np0005466013 podman[226165]: 2025-10-02 12:09:47.718096411 +0000 UTC m=+0.084246285 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:09:48 np0005466013 nova_compute[192144]: 2025-10-02 12:09:48.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:51 np0005466013 nova_compute[192144]: 2025-10-02 12:09:51.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:52Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:0d:d4 10.100.0.12
Oct  2 08:09:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:09:52Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:0d:d4 10.100.0.12
Oct  2 08:09:53 np0005466013 nova_compute[192144]: 2025-10-02 12:09:53.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:54 np0005466013 nova_compute[192144]: 2025-10-02 12:09:54.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466013 nova_compute[192144]: 2025-10-02 12:09:56.371 2 DEBUG nova.compute.manager [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:56 np0005466013 nova_compute[192144]: 2025-10-02 12:09:56.372 2 DEBUG nova.compute.manager [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing instance network info cache due to event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:56 np0005466013 nova_compute[192144]: 2025-10-02 12:09:56.372 2 DEBUG oslo_concurrency.lockutils [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:56 np0005466013 nova_compute[192144]: 2025-10-02 12:09:56.372 2 DEBUG oslo_concurrency.lockutils [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:56 np0005466013 nova_compute[192144]: 2025-10-02 12:09:56.372 2 DEBUG nova.network.neutron [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:56 np0005466013 nova_compute[192144]: 2025-10-02 12:09:56.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:57 np0005466013 nova_compute[192144]: 2025-10-02 12:09:57.700 2 DEBUG nova.network.neutron [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updated VIF entry in instance network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:57 np0005466013 nova_compute[192144]: 2025-10-02 12:09:57.700 2 DEBUG nova.network.neutron [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:57 np0005466013 nova_compute[192144]: 2025-10-02 12:09:57.906 2 DEBUG oslo_concurrency.lockutils [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:58 np0005466013 nova_compute[192144]: 2025-10-02 12:09:58.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:58 np0005466013 nova_compute[192144]: 2025-10-02 12:09:58.851 2 DEBUG nova.compute.manager [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:58 np0005466013 nova_compute[192144]: 2025-10-02 12:09:58.851 2 DEBUG nova.compute.manager [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing instance network info cache due to event network-changed-b0274f45-87f9-46e7-b0b4-c762e6ed2b43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:58 np0005466013 nova_compute[192144]: 2025-10-02 12:09:58.851 2 DEBUG oslo_concurrency.lockutils [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:58 np0005466013 nova_compute[192144]: 2025-10-02 12:09:58.852 2 DEBUG oslo_concurrency.lockutils [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:58 np0005466013 nova_compute[192144]: 2025-10-02 12:09:58.852 2 DEBUG nova.network.neutron [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Refreshing network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:59 np0005466013 nova_compute[192144]: 2025-10-02 12:09:59.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:00 np0005466013 nova_compute[192144]: 2025-10-02 12:10:00.101 2 DEBUG nova.network.neutron [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updated VIF entry in instance network info cache for port b0274f45-87f9-46e7-b0b4-c762e6ed2b43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:10:00 np0005466013 nova_compute[192144]: 2025-10-02 12:10:00.101 2 DEBUG nova.network.neutron [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [{"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:00 np0005466013 nova_compute[192144]: 2025-10-02 12:10:00.132 2 DEBUG oslo_concurrency.lockutils [req-3a5c56a1-0e57-4a12-a133-8022c4869be3 req-e924a400-8852-4db1-bbf6-8c60f919a1f9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:00 np0005466013 podman[226224]: 2025-10-02 12:10:00.686118575 +0000 UTC m=+0.056183795 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:10:00 np0005466013 podman[226223]: 2025-10-02 12:10:00.712089104 +0000 UTC m=+0.086167626 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:10:00 np0005466013 podman[226225]: 2025-10-02 12:10:00.730828911 +0000 UTC m=+0.098310030 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 08:10:01 np0005466013 nova_compute[192144]: 2025-10-02 12:10:01.881 2 DEBUG nova.objects.instance [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lazy-loading 'flavor' on Instance uuid 5498fc15-956a-42a0-817e-e4bb31469607 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:01 np0005466013 nova_compute[192144]: 2025-10-02 12:10:01.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:02 np0005466013 nova_compute[192144]: 2025-10-02 12:10:02.202 2 DEBUG oslo_concurrency.lockutils [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:02 np0005466013 nova_compute[192144]: 2025-10-02 12:10:02.203 2 DEBUG oslo_concurrency.lockutils [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquired lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:02.291 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:02.291 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:02.292 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.280 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.280 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.280 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.281 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.281 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.346 2 INFO nova.compute.manager [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Terminating instance#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.435 2 DEBUG nova.compute.manager [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:10:03 np0005466013 kernel: tapb0274f45-87 (unregistering): left promiscuous mode
Oct  2 08:10:03 np0005466013 NetworkManager[51205]: <info>  [1759407003.4610] device (tapb0274f45-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:03Z|00128|binding|INFO|Releasing lport b0274f45-87f9-46e7-b0b4-c762e6ed2b43 from this chassis (sb_readonly=0)
Oct  2 08:10:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:03Z|00129|binding|INFO|Setting lport b0274f45-87f9-46e7-b0b4-c762e6ed2b43 down in Southbound
Oct  2 08:10:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:03Z|00130|binding|INFO|Removing iface tapb0274f45-87 ovn-installed in OVS
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000030.scope: Deactivated successfully.
Oct  2 08:10:03 np0005466013 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000030.scope: Consumed 14.467s CPU time.
Oct  2 08:10:03 np0005466013 systemd-machined[152202]: Machine qemu-19-instance-00000030 terminated.
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.540 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:44:bd 10.100.0.9'], port_security=['fa:16:3e:1b:44:bd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'aa3ad2f7-f503-4fb3-b13f-c3691b7f7700', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53cd9990789640a5b5e28b5beb8b222b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '451fc8d0-64dd-41c6-91ef-df444df65e30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9845a444-54df-440f-9a26-b835473c9d1b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=b0274f45-87f9-46e7-b0b4-c762e6ed2b43) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.542 103323 INFO neutron.agent.ovn.metadata.agent [-] Port b0274f45-87f9-46e7-b0b4-c762e6ed2b43 in datapath e3531c03-dcc1-4c2a-981f-8534850ce14f unbound from our chassis#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.544 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3531c03-dcc1-4c2a-981f-8534850ce14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.545 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd480ae-5868-4fbb-bcb5-70ab5b0c93a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.545 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f namespace which is not needed anymore#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.612 2 DEBUG nova.network.neutron [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:10:03 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [NOTICE]   (225859) : haproxy version is 2.8.14-c23fe91
Oct  2 08:10:03 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [NOTICE]   (225859) : path to executable is /usr/sbin/haproxy
Oct  2 08:10:03 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [WARNING]  (225859) : Exiting Master process...
Oct  2 08:10:03 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [WARNING]  (225859) : Exiting Master process...
Oct  2 08:10:03 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [ALERT]    (225859) : Current worker (225861) exited with code 143 (Terminated)
Oct  2 08:10:03 np0005466013 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[225855]: [WARNING]  (225859) : All workers exited. Exiting... (0)
Oct  2 08:10:03 np0005466013 systemd[1]: libpod-9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67.scope: Deactivated successfully.
Oct  2 08:10:03 np0005466013 podman[226309]: 2025-10-02 12:10:03.689596234 +0000 UTC m=+0.051834104 container died 9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.705 2 INFO nova.virt.libvirt.driver [-] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Instance destroyed successfully.#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.706 2 DEBUG nova.objects.instance [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lazy-loading 'resources' on Instance uuid aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:03 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67-userdata-shm.mount: Deactivated successfully.
Oct  2 08:10:03 np0005466013 systemd[1]: var-lib-containers-storage-overlay-ec81b60a0db651e7f87ed9e34de9c924c40875d7861d4466bc954345c84240a6-merged.mount: Deactivated successfully.
Oct  2 08:10:03 np0005466013 podman[226309]: 2025-10-02 12:10:03.767533114 +0000 UTC m=+0.129770984 container cleanup 9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.773 2 DEBUG nova.virt.libvirt.vif [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:09:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1306562486',display_name='tempest-FloatingIPsAssociationTestJSON-server-1306562486',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1306562486',id=48,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='53cd9990789640a5b5e28b5beb8b222b',ramdisk_id='',reservation_id='r-1hewk00f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-132128146',owner_user_name='tempest-FloatingIPsAssociationTestJSON-132128146-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:19Z,user_data=None,user_id='725180cfb6174d38a53f3965d04a4916',uuid=aa3ad2f7-f503-4fb3-b13f-c3691b7f7700,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.774 2 DEBUG nova.network.os_vif_util [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converting VIF {"id": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "address": "fa:16:3e:1b:44:bd", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0274f45-87", "ovs_interfaceid": "b0274f45-87f9-46e7-b0b4-c762e6ed2b43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.775 2 DEBUG nova.network.os_vif_util [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1b:44:bd,bridge_name='br-int',has_traffic_filtering=True,id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0274f45-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.776 2 DEBUG os_vif [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:44:bd,bridge_name='br-int',has_traffic_filtering=True,id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0274f45-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:10:03 np0005466013 systemd[1]: libpod-conmon-9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67.scope: Deactivated successfully.
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.780 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0274f45-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.783 2 DEBUG nova.compute.manager [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.783 2 DEBUG nova.compute.manager [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing instance network info cache due to event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.783 2 DEBUG oslo_concurrency.lockutils [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.788 2 INFO os_vif [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:44:bd,bridge_name='br-int',has_traffic_filtering=True,id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0274f45-87')#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.789 2 INFO nova.virt.libvirt.driver [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Deleting instance files /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700_del#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.790 2 INFO nova.virt.libvirt.driver [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Deletion of /var/lib/nova/instances/aa3ad2f7-f503-4fb3-b13f-c3691b7f7700_del complete#033[00m
Oct  2 08:10:03 np0005466013 podman[226351]: 2025-10-02 12:10:03.833065366 +0000 UTC m=+0.043284413 container remove 9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.838 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[445363c7-6ba1-45d2-9041-3a93be513cac]: (4, ('Thu Oct  2 12:10:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f (9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67)\n9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67\nThu Oct  2 12:10:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f (9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67)\n9e886f1a7d82f415dd76908c13d715cbffa52f9677d7813aa6b29062bbac6e67\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.839 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9524ca-c40b-4032-b471-9910dc2af259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.840 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3531c03-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:03 np0005466013 kernel: tape3531c03-d0: left promiscuous mode
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 nova_compute[192144]: 2025-10-02 12:10:03.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.858 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cd237acf-f8fa-4258-b8f8-0df1665c5490]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.883 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8c48c9-1241-4994-adbe-4b776b300c51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.884 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c650f561-d6f1-4789-b91b-e957ff07fa34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.902 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[41498448-0aa2-4edb-a3f2-d309c720fd4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494973, 'reachable_time': 42470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226367, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.904 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:10:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:03.904 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb75c10-c57d-476b-b629-d964b2965c28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:03 np0005466013 systemd[1]: run-netns-ovnmeta\x2de3531c03\x2ddcc1\x2d4c2a\x2d981f\x2d8534850ce14f.mount: Deactivated successfully.
Oct  2 08:10:04 np0005466013 nova_compute[192144]: 2025-10-02 12:10:04.130 2 INFO nova.compute.manager [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:10:04 np0005466013 nova_compute[192144]: 2025-10-02 12:10:04.131 2 DEBUG oslo.service.loopingcall [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:10:04 np0005466013 nova_compute[192144]: 2025-10-02 12:10:04.131 2 DEBUG nova.compute.manager [-] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:10:04 np0005466013 nova_compute[192144]: 2025-10-02 12:10:04.131 2 DEBUG nova.network.neutron [-] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.022 2 DEBUG nova.network.neutron [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.133 2 DEBUG oslo_concurrency.lockutils [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Releasing lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.133 2 DEBUG nova.compute.manager [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.133 2 DEBUG nova.compute.manager [None req-c4e065a4-0a11-4553-b49f-891628808c7c 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] network_info to inject: |[{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.136 2 DEBUG oslo_concurrency.lockutils [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.137 2 DEBUG nova.network.neutron [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.367 2 DEBUG nova.network.neutron [-] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.522 2 INFO nova.compute.manager [-] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Took 1.39 seconds to deallocate network for instance.#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.568 2 DEBUG nova.compute.manager [req-4271915c-26e7-4454-8240-7ffa0eb1f8e4 req-2b63a3b5-8c30-4928-a67e-5509fdd1a9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-vif-deleted-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.568 2 INFO nova.compute.manager [req-4271915c-26e7-4454-8240-7ffa0eb1f8e4 req-2b63a3b5-8c30-4928-a67e-5509fdd1a9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Neutron deleted interface b0274f45-87f9-46e7-b0b4-c762e6ed2b43; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.569 2 DEBUG nova.network.neutron [req-4271915c-26e7-4454-8240-7ffa0eb1f8e4 req-2b63a3b5-8c30-4928-a67e-5509fdd1a9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.655 2 DEBUG nova.compute.manager [req-4271915c-26e7-4454-8240-7ffa0eb1f8e4 req-2b63a3b5-8c30-4928-a67e-5509fdd1a9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Detach interface failed, port_id=b0274f45-87f9-46e7-b0b4-c762e6ed2b43, reason: Instance aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.935 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:05 np0005466013 nova_compute[192144]: 2025-10-02 12:10:05.935 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.016 2 DEBUG nova.scheduler.client.report [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.071 2 DEBUG nova.scheduler.client.report [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.071 2 DEBUG nova.compute.provider_tree [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.141 2 DEBUG nova.scheduler.client.report [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.178 2 DEBUG nova.scheduler.client.report [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.202 2 DEBUG nova.compute.manager [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-vif-unplugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.203 2 DEBUG oslo_concurrency.lockutils [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.203 2 DEBUG oslo_concurrency.lockutils [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.203 2 DEBUG oslo_concurrency.lockutils [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.204 2 DEBUG nova.compute.manager [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] No waiting events found dispatching network-vif-unplugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.204 2 WARNING nova.compute.manager [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received unexpected event network-vif-unplugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.204 2 DEBUG nova.compute.manager [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received event network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.204 2 DEBUG oslo_concurrency.lockutils [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.205 2 DEBUG oslo_concurrency.lockutils [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.205 2 DEBUG oslo_concurrency.lockutils [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.205 2 DEBUG nova.compute.manager [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] No waiting events found dispatching network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.206 2 WARNING nova.compute.manager [req-f6f606d5-a406-4056-8f20-d922b8bf0dbb req-918495d3-12fb-4cfa-b3f2-4e6f604a7fb2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Received unexpected event network-vif-plugged-b0274f45-87f9-46e7-b0b4-c762e6ed2b43 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.245 2 DEBUG nova.compute.provider_tree [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.265 2 DEBUG nova.scheduler.client.report [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.294 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.319 2 INFO nova.scheduler.client.report [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Deleted allocations for instance aa3ad2f7-f503-4fb3-b13f-c3691b7f7700#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.401 2 DEBUG oslo_concurrency.lockutils [None req-f87baf58-21d4-4972-b7fb-dbb60ac3c2d1 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "aa3ad2f7-f503-4fb3-b13f-c3691b7f7700" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.461 2 DEBUG nova.objects.instance [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lazy-loading 'flavor' on Instance uuid 5498fc15-956a-42a0-817e-e4bb31469607 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:06 np0005466013 nova_compute[192144]: 2025-10-02 12:10:06.510 2 DEBUG oslo_concurrency.lockutils [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:07 np0005466013 nova_compute[192144]: 2025-10-02 12:10:07.099 2 DEBUG nova.network.neutron [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updated VIF entry in instance network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:10:07 np0005466013 nova_compute[192144]: 2025-10-02 12:10:07.099 2 DEBUG nova.network.neutron [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:07 np0005466013 nova_compute[192144]: 2025-10-02 12:10:07.115 2 DEBUG oslo_concurrency.lockutils [req-43db0c53-52b7-4367-9eb2-93391d84a638 req-e1fe49b3-19dc-406d-acd6-32a9af4eecf9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:07 np0005466013 nova_compute[192144]: 2025-10-02 12:10:07.116 2 DEBUG oslo_concurrency.lockutils [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquired lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:08 np0005466013 nova_compute[192144]: 2025-10-02 12:10:08.011 2 DEBUG nova.network.neutron [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:10:08 np0005466013 nova_compute[192144]: 2025-10-02 12:10:08.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:08 np0005466013 nova_compute[192144]: 2025-10-02 12:10:08.304 2 DEBUG nova.compute.manager [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:08 np0005466013 nova_compute[192144]: 2025-10-02 12:10:08.305 2 DEBUG nova.compute.manager [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing instance network info cache due to event network-changed-ec65b119-dfb7-4027-bd99-1b3de8416626. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:10:08 np0005466013 nova_compute[192144]: 2025-10-02 12:10:08.306 2 DEBUG oslo_concurrency.lockutils [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:08 np0005466013 nova_compute[192144]: 2025-10-02 12:10:08.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:11 np0005466013 nova_compute[192144]: 2025-10-02 12:10:11.638 2 DEBUG nova.network.neutron [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:11 np0005466013 podman[226368]: 2025-10-02 12:10:11.679406673 +0000 UTC m=+0.054802565 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:10:11 np0005466013 nova_compute[192144]: 2025-10-02 12:10:11.679 2 DEBUG oslo_concurrency.lockutils [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Releasing lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:11 np0005466013 nova_compute[192144]: 2025-10-02 12:10:11.680 2 DEBUG nova.compute.manager [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Oct  2 08:10:11 np0005466013 nova_compute[192144]: 2025-10-02 12:10:11.680 2 DEBUG nova.compute.manager [None req-3e7dac8d-b7f8-43bd-8724-c1ab7db1320b 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] network_info to inject: |[{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Oct  2 08:10:11 np0005466013 nova_compute[192144]: 2025-10-02 12:10:11.682 2 DEBUG oslo_concurrency.lockutils [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:11 np0005466013 nova_compute[192144]: 2025-10-02 12:10:11.682 2 DEBUG nova.network.neutron [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Refreshing network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:10:12 np0005466013 nova_compute[192144]: 2025-10-02 12:10:12.932 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:12 np0005466013 nova_compute[192144]: 2025-10-02 12:10:12.933 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:12 np0005466013 nova_compute[192144]: 2025-10-02 12:10:12.933 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:12 np0005466013 nova_compute[192144]: 2025-10-02 12:10:12.933 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:12 np0005466013 nova_compute[192144]: 2025-10-02 12:10:12.934 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:12 np0005466013 nova_compute[192144]: 2025-10-02 12:10:12.950 2 INFO nova.compute.manager [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Terminating instance#033[00m
Oct  2 08:10:12 np0005466013 nova_compute[192144]: 2025-10-02 12:10:12.971 2 DEBUG nova.compute.manager [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:10:12 np0005466013 kernel: tapec65b119-df (unregistering): left promiscuous mode
Oct  2 08:10:12 np0005466013 NetworkManager[51205]: <info>  [1759407012.9983] device (tapec65b119-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:10:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:13Z|00131|binding|INFO|Releasing lport ec65b119-dfb7-4027-bd99-1b3de8416626 from this chassis (sb_readonly=0)
Oct  2 08:10:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:13Z|00132|binding|INFO|Setting lport ec65b119-dfb7-4027-bd99-1b3de8416626 down in Southbound
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:13Z|00133|binding|INFO|Removing iface tapec65b119-df ovn-installed in OVS
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.027 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:0d:d4 10.100.0.12'], port_security=['fa:16:3e:ac:0d:d4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5498fc15-956a-42a0-817e-e4bb31469607', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e24c283-8cbc-4703-9e61-98b782609dec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bddb0cc777354c619155396b3af4a779', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a5b39b52-31e4-466c-a577-6a5a4ac5a248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c15321d-9c09-4ce1-a81b-3f0dd80999b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ec65b119-dfb7-4027-bd99-1b3de8416626) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.028 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ec65b119-dfb7-4027-bd99-1b3de8416626 in datapath 9e24c283-8cbc-4703-9e61-98b782609dec unbound from our chassis#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.031 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e24c283-8cbc-4703-9e61-98b782609dec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.033 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[04baccd7-44db-401d-bd48-58222068ec50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.034 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec namespace which is not needed anymore#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000032.scope: Deactivated successfully.
Oct  2 08:10:13 np0005466013 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000032.scope: Consumed 14.606s CPU time.
Oct  2 08:10:13 np0005466013 systemd-machined[152202]: Machine qemu-20-instance-00000032 terminated.
Oct  2 08:10:13 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [NOTICE]   (226090) : haproxy version is 2.8.14-c23fe91
Oct  2 08:10:13 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [NOTICE]   (226090) : path to executable is /usr/sbin/haproxy
Oct  2 08:10:13 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [WARNING]  (226090) : Exiting Master process...
Oct  2 08:10:13 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [WARNING]  (226090) : Exiting Master process...
Oct  2 08:10:13 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [ALERT]    (226090) : Current worker (226092) exited with code 143 (Terminated)
Oct  2 08:10:13 np0005466013 neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec[226086]: [WARNING]  (226090) : All workers exited. Exiting... (0)
Oct  2 08:10:13 np0005466013 systemd[1]: libpod-bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a.scope: Deactivated successfully.
Oct  2 08:10:13 np0005466013 podman[226412]: 2025-10-02 12:10:13.196885249 +0000 UTC m=+0.079600932 container died bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.230 2 INFO nova.virt.libvirt.driver [-] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Instance destroyed successfully.#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.231 2 DEBUG nova.objects.instance [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lazy-loading 'resources' on Instance uuid 5498fc15-956a-42a0-817e-e4bb31469607 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:13 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a-userdata-shm.mount: Deactivated successfully.
Oct  2 08:10:13 np0005466013 systemd[1]: var-lib-containers-storage-overlay-2482d772718bb72c5c06fa41f5977af25f28154e5a91dc798b04b8b9f4cb1790-merged.mount: Deactivated successfully.
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.251 2 DEBUG nova.virt.libvirt.vif [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:09:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-317293098',display_name='tempest-AttachInterfacesUnderV243Test-server-317293098',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-317293098',id=50,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIZvnjWPkZxqSPdIDG0zWedfhIItto24yt8Qc/6lQ3UzQDrUOVIiy5BZb2l3Cs/s/tvrOYJJdssgLv7yaV7Niy0i4bWu2vTMzJa7gPQs7zbNxPodTDIjKAdRcs6dhQdYQA==',key_name='tempest-keypair-712590735',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bddb0cc777354c619155396b3af4a779',ramdisk_id='',reservation_id='r-4voredoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-1757342469',owner_user_name='tempest-AttachInterfacesUnderV243Test-1757342469-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='45712a323f9248c3b83534f5afa82f60',uuid=5498fc15-956a-42a0-817e-e4bb31469607,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.252 2 DEBUG nova.network.os_vif_util [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Converting VIF {"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.253 2 DEBUG nova.network.os_vif_util [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:0d:d4,bridge_name='br-int',has_traffic_filtering=True,id=ec65b119-dfb7-4027-bd99-1b3de8416626,network=Network(9e24c283-8cbc-4703-9e61-98b782609dec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec65b119-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.253 2 DEBUG os_vif [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:0d:d4,bridge_name='br-int',has_traffic_filtering=True,id=ec65b119-dfb7-4027-bd99-1b3de8416626,network=Network(9e24c283-8cbc-4703-9e61-98b782609dec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec65b119-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec65b119-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.263 2 INFO os_vif [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:0d:d4,bridge_name='br-int',has_traffic_filtering=True,id=ec65b119-dfb7-4027-bd99-1b3de8416626,network=Network(9e24c283-8cbc-4703-9e61-98b782609dec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec65b119-df')#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.263 2 INFO nova.virt.libvirt.driver [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Deleting instance files /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607_del#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.264 2 INFO nova.virt.libvirt.driver [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Deletion of /var/lib/nova/instances/5498fc15-956a-42a0-817e-e4bb31469607_del complete#033[00m
Oct  2 08:10:13 np0005466013 podman[226412]: 2025-10-02 12:10:13.282115742 +0000 UTC m=+0.164831395 container cleanup bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:10:13 np0005466013 systemd[1]: libpod-conmon-bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a.scope: Deactivated successfully.
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.371 2 INFO nova.compute.manager [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.372 2 DEBUG oslo.service.loopingcall [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.372 2 DEBUG nova.compute.manager [-] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.372 2 DEBUG nova.network.neutron [-] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:10:13 np0005466013 podman[226458]: 2025-10-02 12:10:13.451122154 +0000 UTC m=+0.148818096 container remove bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.458 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3409b1e2-4809-4218-a284-054815d46a7d]: (4, ('Thu Oct  2 12:10:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec (bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a)\nbd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a\nThu Oct  2 12:10:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec (bd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a)\nbd020f904c8a5bbe17880092e784637c5ea990945302dd170df0f9ad5b57ac2a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.460 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[07eb7c40-2947-4c79-87b6-c9a57a7a8b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.461 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e24c283-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 kernel: tap9e24c283-80: left promiscuous mode
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.468 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6a35b7a1-949d-4b9f-8704-c391c67d23c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:13 np0005466013 nova_compute[192144]: 2025-10-02 12:10:13.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.493 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3d7460-4550-4406-804a-eb0241220e2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.495 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3ba4db-72a4-4f72-8a06-41b67caa3be6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.511 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1f82e331-060d-476e-92af-7857dbf2266c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496928, 'reachable_time': 24422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226473, 'error': None, 'target': 'ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:13 np0005466013 systemd[1]: run-netns-ovnmeta\x2d9e24c283\x2d8cbc\x2d4703\x2d9e61\x2d98b782609dec.mount: Deactivated successfully.
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.516 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9e24c283-8cbc-4703-9e61-98b782609dec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:10:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:13.516 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[f7fea32d-3cce-4756-b951-c6e0c4e7d319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:14 np0005466013 podman[226475]: 2025-10-02 12:10:14.691544628 +0000 UTC m=+0.068679268 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:10:14 np0005466013 podman[226476]: 2025-10-02 12:10:14.716175431 +0000 UTC m=+0.092082875 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, 
vendor=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.211 2 DEBUG nova.compute.manager [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-vif-unplugged-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.211 2 DEBUG oslo_concurrency.lockutils [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.211 2 DEBUG oslo_concurrency.lockutils [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.211 2 DEBUG oslo_concurrency.lockutils [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.211 2 DEBUG nova.compute.manager [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] No waiting events found dispatching network-vif-unplugged-ec65b119-dfb7-4027-bd99-1b3de8416626 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.212 2 DEBUG nova.compute.manager [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-vif-unplugged-ec65b119-dfb7-4027-bd99-1b3de8416626 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.212 2 DEBUG nova.compute.manager [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.212 2 DEBUG oslo_concurrency.lockutils [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5498fc15-956a-42a0-817e-e4bb31469607-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.212 2 DEBUG oslo_concurrency.lockutils [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.212 2 DEBUG oslo_concurrency.lockutils [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.212 2 DEBUG nova.compute.manager [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] No waiting events found dispatching network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.212 2 WARNING nova.compute.manager [req-a170d3de-7a47-4543-854f-8f105325b0d2 req-b920b216-aaa2-46ee-b1be-e10ddc19cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received unexpected event network-vif-plugged-ec65b119-dfb7-4027-bd99-1b3de8416626 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:15 np0005466013 nova_compute[192144]: 2025-10-02 12:10:15.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.211 2 DEBUG nova.network.neutron [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updated VIF entry in instance network info cache for port ec65b119-dfb7-4027-bd99-1b3de8416626. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.212 2 DEBUG nova.network.neutron [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [{"id": "ec65b119-dfb7-4027-bd99-1b3de8416626", "address": "fa:16:3e:ac:0d:d4", "network": {"id": "9e24c283-8cbc-4703-9e61-98b782609dec", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-409456389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bddb0cc777354c619155396b3af4a779", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec65b119-df", "ovs_interfaceid": "ec65b119-dfb7-4027-bd99-1b3de8416626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.232 2 DEBUG oslo_concurrency.lockutils [req-55cc7bd7-6373-4a84-ba0e-9a9dbdedb4c7 req-ee76528d-bf28-40af-aaf4-2a5b26893afa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-5498fc15-956a-42a0-817e-e4bb31469607" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.364 2 DEBUG nova.network.neutron [-] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.393 2 INFO nova.compute.manager [-] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Took 3.02 seconds to deallocate network for instance.#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.505 2 DEBUG nova.compute.manager [req-65bf632d-2d02-4770-90e6-45d5db662e72 req-72ee007a-284f-4b9e-aece-9c62de862b4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Received event network-vif-deleted-ec65b119-dfb7-4027-bd99-1b3de8416626 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.565 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.566 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.630 2 DEBUG nova.compute.provider_tree [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.645 2 DEBUG nova.scheduler.client.report [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.675 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.746 2 INFO nova.scheduler.client.report [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Deleted allocations for instance 5498fc15-956a-42a0-817e-e4bb31469607#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.883 2 DEBUG oslo_concurrency.lockutils [None req-ad154865-6cf4-4b06-b1f3-a2b6bb7a6fb0 45712a323f9248c3b83534f5afa82f60 bddb0cc777354c619155396b3af4a779 - - default default] Lock "5498fc15-956a-42a0-817e-e4bb31469607" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.950s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:16 np0005466013 nova_compute[192144]: 2025-10-02 12:10:16.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.022 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:10:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:17.071 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:17.072 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.182 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.183 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5735MB free_disk=73.3967514038086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.183 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.183 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.271 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.272 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.297 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.331 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.363 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:10:17 np0005466013 nova_compute[192144]: 2025-10-02 12:10:17.363 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.363 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.364 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.364 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.396 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.397 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:18 np0005466013 podman[226514]: 2025-10-02 12:10:18.685740256 +0000 UTC m=+0.058449085 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:10:18 np0005466013 podman[226515]: 2025-10-02 12:10:18.696743442 +0000 UTC m=+0.062271002 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.704 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407003.703239, aa3ad2f7-f503-4fb3-b13f-c3691b7f7700 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.705 2 INFO nova.compute.manager [-] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] VM Stopped (Lifecycle Event)
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.735 2 DEBUG nova.compute.manager [None req-812bd637-8bbc-4e77-a423-8d217c92e50e - - - - - -] [instance: aa3ad2f7-f503-4fb3-b13f-c3691b7f7700] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:10:18 np0005466013 nova_compute[192144]: 2025-10-02 12:10:18.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:19 np0005466013 nova_compute[192144]: 2025-10-02 12:10:19.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:19 np0005466013 nova_compute[192144]: 2025-10-02 12:10:19.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:19 np0005466013 nova_compute[192144]: 2025-10-02 12:10:19.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:19 np0005466013 nova_compute[192144]: 2025-10-02 12:10:19.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:10:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:20.074 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:10:20 np0005466013 nova_compute[192144]: 2025-10-02 12:10:20.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:21 np0005466013 nova_compute[192144]: 2025-10-02 12:10:21.007 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:23 np0005466013 nova_compute[192144]: 2025-10-02 12:10:23.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:23 np0005466013 nova_compute[192144]: 2025-10-02 12:10:23.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:24 np0005466013 nova_compute[192144]: 2025-10-02 12:10:24.007 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:10:28 np0005466013 nova_compute[192144]: 2025-10-02 12:10:28.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:28 np0005466013 nova_compute[192144]: 2025-10-02 12:10:28.229 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407013.2287593, 5498fc15-956a-42a0-817e-e4bb31469607 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:10:28 np0005466013 nova_compute[192144]: 2025-10-02 12:10:28.230 2 INFO nova.compute.manager [-] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] VM Stopped (Lifecycle Event)
Oct  2 08:10:28 np0005466013 nova_compute[192144]: 2025-10-02 12:10:28.250 2 DEBUG nova.compute.manager [None req-c3d07516-91ca-42f6-a021-0959c6993a2c - - - - - -] [instance: 5498fc15-956a-42a0-817e-e4bb31469607] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:10:28 np0005466013 nova_compute[192144]: 2025-10-02 12:10:28.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:31 np0005466013 podman[226557]: 2025-10-02 12:10:31.668605622 +0000 UTC m=+0.044252223 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:10:31 np0005466013 podman[226556]: 2025-10-02 12:10:31.675583075 +0000 UTC m=+0.051532675 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:10:31 np0005466013 podman[226558]: 2025-10-02 12:10:31.709618424 +0000 UTC m=+0.079210509 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:10:33 np0005466013 nova_compute[192144]: 2025-10-02 12:10:33.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:33 np0005466013 nova_compute[192144]: 2025-10-02 12:10:33.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:38 np0005466013 nova_compute[192144]: 2025-10-02 12:10:38.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:38 np0005466013 nova_compute[192144]: 2025-10-02 12:10:38.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.467 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.467 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.492 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.641 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.641 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.650 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.651 2 INFO nova.compute.claims [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.846 2 DEBUG nova.compute.provider_tree [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:10:39 np0005466013 nova_compute[192144]: 2025-10-02 12:10:39.949 2 DEBUG nova.scheduler.client.report [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:10:40 np0005466013 nova_compute[192144]: 2025-10-02 12:10:40.548 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:40 np0005466013 nova_compute[192144]: 2025-10-02 12:10:40.549 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:10:40 np0005466013 nova_compute[192144]: 2025-10-02 12:10:40.886 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:10:40 np0005466013 nova_compute[192144]: 2025-10-02 12:10:40.887 2 DEBUG nova.network.neutron [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:10:40 np0005466013 nova_compute[192144]: 2025-10-02 12:10:40.929 2 INFO nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:10:40 np0005466013 nova_compute[192144]: 2025-10-02 12:10:40.950 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.126 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.128 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.128 2 INFO nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Creating image(s)
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.129 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.129 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.130 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.147 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.261 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.263 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.263 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.273 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.331 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.333 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:41 np0005466013 nova_compute[192144]: 2025-10-02 12:10:41.770 2 DEBUG nova.policy [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.045 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk 1073741824" returned: 0 in 0.711s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.045 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.046 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.101 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.102 2 DEBUG nova.virt.disk.api [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Checking if we can resize image /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.103 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.166 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.167 2 DEBUG nova.virt.disk.api [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Cannot resize image /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.167 2 DEBUG nova.objects.instance [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'migration_context' on Instance uuid e0215fe6-39be-4529-9345-a5fcb4e3e6ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.192 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.193 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Ensure instance console log exists: /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.193 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.193 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:42 np0005466013 nova_compute[192144]: 2025-10-02 12:10:42.193 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:42 np0005466013 podman[226637]: 2025-10-02 12:10:42.682533034 +0000 UTC m=+0.064318291 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:10:43 np0005466013 nova_compute[192144]: 2025-10-02 12:10:43.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:43 np0005466013 nova_compute[192144]: 2025-10-02 12:10:43.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:43 np0005466013 nova_compute[192144]: 2025-10-02 12:10:43.661 2 DEBUG nova.network.neutron [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Successfully created port: 04cd91f3-a598-48c3-bb6a-789dece3461d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:10:45 np0005466013 podman[226657]: 2025-10-02 12:10:45.667598724 +0000 UTC m=+0.048798218 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Oct  2 08:10:45 np0005466013 podman[226658]: 2025-10-02 12:10:45.672585774 +0000 UTC m=+0.050890572 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.434 2 DEBUG nova.network.neutron [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Successfully updated port: 04cd91f3-a598-48c3-bb6a-789dece3461d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.462 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.462 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.462 2 DEBUG nova.network.neutron [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.627 2 DEBUG nova.compute.manager [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.627 2 DEBUG nova.compute.manager [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing instance network info cache due to event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.627 2 DEBUG oslo_concurrency.lockutils [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:46 np0005466013 nova_compute[192144]: 2025-10-02 12:10:46.803 2 DEBUG nova.network.neutron [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:10:48 np0005466013 nova_compute[192144]: 2025-10-02 12:10:48.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:48 np0005466013 nova_compute[192144]: 2025-10-02 12:10:48.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.428 2 DEBUG nova.network.neutron [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.444 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.445 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Instance network_info: |[{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.445 2 DEBUG oslo_concurrency.lockutils [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.446 2 DEBUG nova.network.neutron [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.448 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Start _get_guest_xml network_info=[{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.452 2 WARNING nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.457 2 DEBUG nova.virt.libvirt.host [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.458 2 DEBUG nova.virt.libvirt.host [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.461 2 DEBUG nova.virt.libvirt.host [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.462 2 DEBUG nova.virt.libvirt.host [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.463 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.463 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.464 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.464 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.464 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.464 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.464 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.465 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.465 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.465 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.466 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.466 2 DEBUG nova.virt.hardware [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.469 2 DEBUG nova.virt.libvirt.vif [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.470 2 DEBUG nova.network.os_vif_util [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.470 2 DEBUG nova.network.os_vif_util [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:f9:18,bridge_name='br-int',has_traffic_filtering=True,id=04cd91f3-a598-48c3-bb6a-789dece3461d,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04cd91f3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.471 2 DEBUG nova.objects.instance [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_devices' on Instance uuid e0215fe6-39be-4529-9345-a5fcb4e3e6ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.485 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <uuid>e0215fe6-39be-4529-9345-a5fcb4e3e6ee</uuid>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <name>instance-00000035</name>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <nova:name>tempest-tempest.common.compute-instance-1321990610</nova:name>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:10:49</nova:creationTime>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        <nova:port uuid="04cd91f3-a598-48c3-bb6a-789dece3461d">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <entry name="serial">e0215fe6-39be-4529-9345-a5fcb4e3e6ee</entry>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <entry name="uuid">e0215fe6-39be-4529-9345-a5fcb4e3e6ee</entry>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.config"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:9c:f9:18"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <target dev="tap04cd91f3-a5"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/console.log" append="off"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:10:49 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:10:49 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:10:49 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:10:49 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.486 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Preparing to wait for external event network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.487 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.487 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.487 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.488 2 DEBUG nova.virt.libvirt.vif [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.488 2 DEBUG nova.network.os_vif_util [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.488 2 DEBUG nova.network.os_vif_util [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:f9:18,bridge_name='br-int',has_traffic_filtering=True,id=04cd91f3-a598-48c3-bb6a-789dece3461d,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04cd91f3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.489 2 DEBUG os_vif [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:f9:18,bridge_name='br-int',has_traffic_filtering=True,id=04cd91f3-a598-48c3-bb6a-789dece3461d,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04cd91f3-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.490 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.492 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04cd91f3-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.493 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04cd91f3-a5, col_values=(('external_ids', {'iface-id': '04cd91f3-a598-48c3-bb6a-789dece3461d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9c:f9:18', 'vm-uuid': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:49 np0005466013 NetworkManager[51205]: <info>  [1759407049.4953] manager: (tap04cd91f3-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.506 2 INFO os_vif [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:f9:18,bridge_name='br-int',has_traffic_filtering=True,id=04cd91f3-a598-48c3-bb6a-789dece3461d,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04cd91f3-a5')#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.624 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.624 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.624 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:9c:f9:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:10:49 np0005466013 nova_compute[192144]: 2025-10-02 12:10:49.625 2 INFO nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Using config drive#033[00m
Oct  2 08:10:49 np0005466013 podman[226697]: 2025-10-02 12:10:49.674040808 +0000 UTC m=+0.050852020 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:10:49 np0005466013 podman[226698]: 2025-10-02 12:10:49.682256133 +0000 UTC m=+0.055816017 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid)
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.650 2 INFO nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Creating config drive at /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.config#033[00m
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.657 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4zd5ouww execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.787 2 DEBUG oslo_concurrency.processutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4zd5ouww" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:50 np0005466013 kernel: tap04cd91f3-a5: entered promiscuous mode
Oct  2 08:10:50 np0005466013 NetworkManager[51205]: <info>  [1759407050.8439] manager: (tap04cd91f3-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:50Z|00134|binding|INFO|Claiming lport 04cd91f3-a598-48c3-bb6a-789dece3461d for this chassis.
Oct  2 08:10:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:50Z|00135|binding|INFO|04cd91f3-a598-48c3-bb6a-789dece3461d: Claiming fa:16:3e:9c:f9:18 10.100.0.11
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:50 np0005466013 nova_compute[192144]: 2025-10-02 12:10:50.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:50 np0005466013 NetworkManager[51205]: <info>  [1759407050.8614] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Oct  2 08:10:50 np0005466013 NetworkManager[51205]: <info>  [1759407050.8623] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.869 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:f9:18 10.100.0.11'], port_security=['fa:16:3e:9c:f9:18 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b3d6cff-45b6-4476-af05-0164bc00fd3b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=04cd91f3-a598-48c3-bb6a-789dece3461d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.870 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 04cd91f3-a598-48c3-bb6a-789dece3461d in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 bound to our chassis#033[00m
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.872 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:10:50 np0005466013 systemd-udevd[226755]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.882 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0d758d-e492-49c5-9c0d-fa910e90d4f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.883 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7d845a33-51 in ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:10:50 np0005466013 NetworkManager[51205]: <info>  [1759407050.8848] device (tap04cd91f3-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.884 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7d845a33-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.884 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f66b4f-ec84-4e88-ba1b-206f93e0573a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.885 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[85108286-d64c-4d2d-b27c-eb8a65acc1c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:50 np0005466013 NetworkManager[51205]: <info>  [1759407050.8862] device (tap04cd91f3-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.896 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[36a822e1-9bfa-44b6-85d4-db02c626741b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:50 np0005466013 systemd-machined[152202]: New machine qemu-21-instance-00000035.
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.928 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf371ea-c8fe-4ac2-a119-072bd237d97b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.954 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7113349d-f5e7-445c-81d2-6e2b46c74cb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:50 np0005466013 systemd-udevd[226759]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:10:50 np0005466013 systemd[1]: Started Virtual Machine qemu-21-instance-00000035.
Oct  2 08:10:50 np0005466013 NetworkManager[51205]: <info>  [1759407050.9724] manager: (tap7d845a33-50): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Oct  2 08:10:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:50.972 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[45c4128b-4784-4d75-8d6b-2dd55bbc9f65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.002 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[2db469e6-d45a-4ac6-b90d-b38c74451c2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.005 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c6a71d-60e9-4b6b-96ac-61d0c44f4d59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 NetworkManager[51205]: <info>  [1759407051.0299] device (tap7d845a33-50): carrier: link connected
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.037 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a292e41d-a374-48a4-ac52-ed52b0d622b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.054 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[007a711b-d9cb-4d10-b6f9-2c310c1fbb84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504264, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226789, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:51Z|00136|binding|INFO|Setting lport 04cd91f3-a598-48c3-bb6a-789dece3461d ovn-installed in OVS
Oct  2 08:10:51 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:51Z|00137|binding|INFO|Setting lport 04cd91f3-a598-48c3-bb6a-789dece3461d up in Southbound
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.070 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bd30526a-a6e0-4fff-a3be-a8888da32ac2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:9016'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504264, 'tstamp': 504264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226791, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.085 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[51b2d247-ff54-4727-9027-28c903759a1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504264, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226792, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.116 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[512d963f-9678-44f7-9f6e-9f461ba5d90b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.167 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f036b328-2a20-47c5-b4c1-9bba2068e8c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.168 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.168 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.169 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 kernel: tap7d845a33-50: entered promiscuous mode
Oct  2 08:10:51 np0005466013 NetworkManager[51205]: <info>  [1759407051.1722] manager: (tap7d845a33-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.175 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 ovn_controller[94366]: 2025-10-02T12:10:51Z|00138|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.177 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.178 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9adef71a-cac6-4434-916d-f94e98980131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.179 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-7d845a33-56e0-4850-9f27-8a54095796f2
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 7d845a33-56e0-4850-9f27-8a54095796f2
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:10:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:10:51.182 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'env', 'PROCESS_TAG=haproxy-7d845a33-56e0-4850-9f27-8a54095796f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7d845a33-56e0-4850-9f27-8a54095796f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.587 2 DEBUG nova.compute.manager [req-5bb64609-65a4-4163-aca7-9e86e232178e req-6256dfb1-7acc-4290-ade2-31842ce4d9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.588 2 DEBUG oslo_concurrency.lockutils [req-5bb64609-65a4-4163-aca7-9e86e232178e req-6256dfb1-7acc-4290-ade2-31842ce4d9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.589 2 DEBUG oslo_concurrency.lockutils [req-5bb64609-65a4-4163-aca7-9e86e232178e req-6256dfb1-7acc-4290-ade2-31842ce4d9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.589 2 DEBUG oslo_concurrency.lockutils [req-5bb64609-65a4-4163-aca7-9e86e232178e req-6256dfb1-7acc-4290-ade2-31842ce4d9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.589 2 DEBUG nova.compute.manager [req-5bb64609-65a4-4163-aca7-9e86e232178e req-6256dfb1-7acc-4290-ade2-31842ce4d9f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Processing event network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.617 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407051.6169639, e0215fe6-39be-4529-9345-a5fcb4e3e6ee => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.618 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] VM Started (Lifecycle Event)
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.620 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.624 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.628 2 INFO nova.virt.libvirt.driver [-] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Instance spawned successfully.
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.629 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.652 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.657 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:10:51 np0005466013 podman[226831]: 2025-10-02 12:10:51.662537825 +0000 UTC m=+0.074353142 container create 878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.677 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.678 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.679 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.679 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.680 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.680 2 DEBUG nova.virt.libvirt.driver [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.691 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.691 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407051.6171029, e0215fe6-39be-4529-9345-a5fcb4e3e6ee => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.692 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] VM Paused (Lifecycle Event)
Oct  2 08:10:51 np0005466013 systemd[1]: Started libpod-conmon-878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5.scope.
Oct  2 08:10:51 np0005466013 podman[226831]: 2025-10-02 12:10:51.619832049 +0000 UTC m=+0.031647386 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.723 2 DEBUG nova.network.neutron [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updated VIF entry in instance network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.724 2 DEBUG nova.network.neutron [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:10:51 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:10:51 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0502dc157657651b30b6d4846c0786fa0b36e60dd957145453926a6ee92b92b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.749 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:10:51 np0005466013 podman[226831]: 2025-10-02 12:10:51.749959155 +0000 UTC m=+0.161774492 container init 878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.752 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407051.6231658, e0215fe6-39be-4529-9345-a5fcb4e3e6ee => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.753 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] VM Resumed (Lifecycle Event)
Oct  2 08:10:51 np0005466013 podman[226831]: 2025-10-02 12:10:51.75579825 +0000 UTC m=+0.167613567 container start 878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.775 2 DEBUG oslo_concurrency.lockutils [req-0ff56ee0-e503-4a25-b385-7784d54e4618 req-bd38e9ad-a6ab-4173-aea0-3e55f6c985d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:10:51 np0005466013 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226846]: [NOTICE]   (226850) : New worker (226852) forked
Oct  2 08:10:51 np0005466013 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226846]: [NOTICE]   (226850) : Loading success.
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.791 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.794 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.821 2 INFO nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Took 10.69 seconds to spawn the instance on the hypervisor.
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.822 2 DEBUG nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.823 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:10:51 np0005466013 nova_compute[192144]: 2025-10-02 12:10:51.965 2 INFO nova.compute.manager [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Took 12.38 seconds to build instance.
Oct  2 08:10:52 np0005466013 nova_compute[192144]: 2025-10-02 12:10:52.002 2 DEBUG oslo_concurrency.lockutils [None req-ce1019fc-9d30-435b-9345-c8dc4b99ce0b fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:53 np0005466013 nova_compute[192144]: 2025-10-02 12:10:53.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:53 np0005466013 nova_compute[192144]: 2025-10-02 12:10:53.739 2 DEBUG nova.compute.manager [req-e08bcc16-f0b9-4a17-b163-ece49454e2a0 req-739deec9-8ad6-4316-8dce-fe1603afb327 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:10:53 np0005466013 nova_compute[192144]: 2025-10-02 12:10:53.740 2 DEBUG oslo_concurrency.lockutils [req-e08bcc16-f0b9-4a17-b163-ece49454e2a0 req-739deec9-8ad6-4316-8dce-fe1603afb327 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:53 np0005466013 nova_compute[192144]: 2025-10-02 12:10:53.740 2 DEBUG oslo_concurrency.lockutils [req-e08bcc16-f0b9-4a17-b163-ece49454e2a0 req-739deec9-8ad6-4316-8dce-fe1603afb327 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:53 np0005466013 nova_compute[192144]: 2025-10-02 12:10:53.740 2 DEBUG oslo_concurrency.lockutils [req-e08bcc16-f0b9-4a17-b163-ece49454e2a0 req-739deec9-8ad6-4316-8dce-fe1603afb327 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:53 np0005466013 nova_compute[192144]: 2025-10-02 12:10:53.740 2 DEBUG nova.compute.manager [req-e08bcc16-f0b9-4a17-b163-ece49454e2a0 req-739deec9-8ad6-4316-8dce-fe1603afb327 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] No waiting events found dispatching network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:10:53 np0005466013 nova_compute[192144]: 2025-10-02 12:10:53.740 2 WARNING nova.compute.manager [req-e08bcc16-f0b9-4a17-b163-ece49454e2a0 req-739deec9-8ad6-4316-8dce-fe1603afb327 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received unexpected event network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d for instance with vm_state active and task_state None.
Oct  2 08:10:54 np0005466013 nova_compute[192144]: 2025-10-02 12:10:54.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.004 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.004 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.026 2 DEBUG nova.compute.manager [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.027 2 DEBUG nova.compute.manager [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing instance network info cache due to event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.027 2 DEBUG oslo_concurrency.lockutils [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.027 2 DEBUG oslo_concurrency.lockutils [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.027 2 DEBUG nova.network.neutron [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.050 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.248 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.248 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.256 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.256 2 INFO nova.compute.claims [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.454 2 DEBUG nova.compute.provider_tree [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.488 2 DEBUG nova.scheduler.client.report [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.513 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.513 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.582 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.583 2 DEBUG nova.network.neutron [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.605 2 INFO nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.624 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.768 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.770 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.770 2 INFO nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Creating image(s)
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.771 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "/var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.771 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "/var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.772 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "/var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.786 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.849 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.850 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.851 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.864 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.929 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.931 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.997 2 DEBUG nova.compute.manager [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.997 2 DEBUG nova.compute.manager [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing instance network info cache due to event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:10:58 np0005466013 nova_compute[192144]: 2025-10-02 12:10:58.998 2 DEBUG oslo_concurrency.lockutils [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.056 2 DEBUG nova.policy [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.114 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk 1073741824" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.115 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.116 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.186 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.187 2 DEBUG nova.virt.disk.api [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Checking if we can resize image /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.188 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.254 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.255 2 DEBUG nova.virt.disk.api [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Cannot resize image /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.255 2 DEBUG nova.objects.instance [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lazy-loading 'migration_context' on Instance uuid f12db2c7-b990-46f2-8865-699df3c176e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.273 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.273 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Ensure instance console log exists: /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.274 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.275 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.275 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:59 np0005466013 nova_compute[192144]: 2025-10-02 12:10:59.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:00 np0005466013 nova_compute[192144]: 2025-10-02 12:11:00.271 2 DEBUG nova.network.neutron [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updated VIF entry in instance network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:00 np0005466013 nova_compute[192144]: 2025-10-02 12:11:00.275 2 DEBUG nova.network.neutron [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:00 np0005466013 nova_compute[192144]: 2025-10-02 12:11:00.311 2 DEBUG oslo_concurrency.lockutils [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:00 np0005466013 nova_compute[192144]: 2025-10-02 12:11:00.314 2 DEBUG oslo_concurrency.lockutils [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:00 np0005466013 nova_compute[192144]: 2025-10-02 12:11:00.315 2 DEBUG nova.network.neutron [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:01 np0005466013 nova_compute[192144]: 2025-10-02 12:11:01.177 2 DEBUG nova.network.neutron [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Successfully created port: 1f708987-9c45-4948-b794-28b4c634ea5d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:11:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:02.292 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:02.296 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:02.298 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:02 np0005466013 nova_compute[192144]: 2025-10-02 12:11:02.513 2 DEBUG nova.network.neutron [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updated VIF entry in instance network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:02 np0005466013 nova_compute[192144]: 2025-10-02 12:11:02.513 2 DEBUG nova.network.neutron [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:02 np0005466013 nova_compute[192144]: 2025-10-02 12:11:02.530 2 DEBUG oslo_concurrency.lockutils [req-25328763-88a9-4347-a3b5-3d4121fb1318 req-ed6b78fa-6b7f-43c2-b112-6349b6d3b532 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:02 np0005466013 podman[226876]: 2025-10-02 12:11:02.688400114 +0000 UTC m=+0.064134126 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:11:02 np0005466013 podman[226877]: 2025-10-02 12:11:02.71007266 +0000 UTC m=+0.082772912 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:11:02 np0005466013 podman[226878]: 2025-10-02 12:11:02.722289365 +0000 UTC m=+0.093045960 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.147 2 DEBUG nova.network.neutron [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Successfully updated port: 1f708987-9c45-4948-b794-28b4c634ea5d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.178 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.179 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquired lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.179 2 DEBUG nova.network.neutron [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.264 2 DEBUG nova.compute.manager [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-changed-1f708987-9c45-4948-b794-28b4c634ea5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.264 2 DEBUG nova.compute.manager [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Refreshing instance network info cache due to event network-changed-1f708987-9c45-4948-b794-28b4c634ea5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.265 2 DEBUG oslo_concurrency.lockutils [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:03 np0005466013 nova_compute[192144]: 2025-10-02 12:11:03.390 2 DEBUG nova.network.neutron [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.921 2 DEBUG nova.network.neutron [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Updating instance_info_cache with network_info: [{"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.942 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Releasing lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.943 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Instance network_info: |[{"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.943 2 DEBUG oslo_concurrency.lockutils [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.944 2 DEBUG nova.network.neutron [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Refreshing network info cache for port 1f708987-9c45-4948-b794-28b4c634ea5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.952 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Start _get_guest_xml network_info=[{"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.961 2 WARNING nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.966 2 DEBUG nova.virt.libvirt.host [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.967 2 DEBUG nova.virt.libvirt.host [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.975 2 DEBUG nova.virt.libvirt.host [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.975 2 DEBUG nova.virt.libvirt.host [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.977 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.977 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.978 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.978 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.978 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.979 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.979 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.979 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.980 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.980 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.980 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.981 2 DEBUG nova.virt.hardware [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.985 2 DEBUG nova.virt.libvirt.vif [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=55,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlSyfGZ4DLGyW6nLdJOEus31UhYB05UFJ8yocUsxwa8dND8GhDYJhmm+n/B+Hn9fiVn10MXIKsKGB9vg58iJyGT4TqSHNBf4PV0LNj44WE3+z/u3L3HZLHJoFJG3oiLzQ==',key_name='tempest-keypair-174484104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c0445031175477fb35cf45ea4e8ebe9',ramdisk_id='',reservation_id='r-eqv1ywk4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image
_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-1450369239',owner_user_name='tempest-ServersTestFqdnHostnames-1450369239-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5b11a704bc3240da9c36e22382c9bd70',uuid=f12db2c7-b990-46f2-8865-699df3c176e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.985 2 DEBUG nova.network.os_vif_util [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Converting VIF {"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.986 2 DEBUG nova.network.os_vif_util [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:cb:d1,bridge_name='br-int',has_traffic_filtering=True,id=1f708987-9c45-4948-b794-28b4c634ea5d,network=Network(6b319229-b7a5-4343-8a9f-30a4189f9c4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f708987-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:04 np0005466013 nova_compute[192144]: 2025-10-02 12:11:04.988 2 DEBUG nova.objects.instance [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lazy-loading 'pci_devices' on Instance uuid f12db2c7-b990-46f2-8865-699df3c176e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.002 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <uuid>f12db2c7-b990-46f2-8865-699df3c176e6</uuid>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <name>instance-00000037</name>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <nova:name>guest-instance-1.domain.com</nova:name>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:11:04</nova:creationTime>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:user uuid="5b11a704bc3240da9c36e22382c9bd70">tempest-ServersTestFqdnHostnames-1450369239-project-member</nova:user>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:project uuid="7c0445031175477fb35cf45ea4e8ebe9">tempest-ServersTestFqdnHostnames-1450369239</nova:project>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        <nova:port uuid="1f708987-9c45-4948-b794-28b4c634ea5d">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <entry name="serial">f12db2c7-b990-46f2-8865-699df3c176e6</entry>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <entry name="uuid">f12db2c7-b990-46f2-8865-699df3c176e6</entry>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk.config"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:61:cb:d1"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <target dev="tap1f708987-9c"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/console.log" append="off"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:11:05 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:11:05 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:11:05 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:11:05 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.004 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Preparing to wait for external event network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.004 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.004 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.005 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.005 2 DEBUG nova.virt.libvirt.vif [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=55,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlSyfGZ4DLGyW6nLdJOEus31UhYB05UFJ8yocUsxwa8dND8GhDYJhmm+n/B+Hn9fiVn10MXIKsKGB9vg58iJyGT4TqSHNBf4PV0LNj44WE3+z/u3L3HZLHJoFJG3oiLzQ==',key_name='tempest-keypair-174484104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c0445031175477fb35cf45ea4e8ebe9',ramdisk_id='',reservation_id='r-eqv1ywk4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-1450369239',owner_user_name='tempest-ServersTestFqdnHostnames-1450369239-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5b11a704bc3240da9c36e22382c9bd70',uuid=f12db2c7-b990-46f2-8865-699df3c176e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.006 2 DEBUG nova.network.os_vif_util [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Converting VIF {"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.006 2 DEBUG nova.network.os_vif_util [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:cb:d1,bridge_name='br-int',has_traffic_filtering=True,id=1f708987-9c45-4948-b794-28b4c634ea5d,network=Network(6b319229-b7a5-4343-8a9f-30a4189f9c4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f708987-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.007 2 DEBUG os_vif [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:cb:d1,bridge_name='br-int',has_traffic_filtering=True,id=1f708987-9c45-4948-b794-28b4c634ea5d,network=Network(6b319229-b7a5-4343-8a9f-30a4189f9c4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f708987-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.008 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.008 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.012 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f708987-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.013 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f708987-9c, col_values=(('external_ids', {'iface-id': '1f708987-9c45-4948-b794-28b4c634ea5d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:cb:d1', 'vm-uuid': 'f12db2c7-b990-46f2-8865-699df3c176e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:05 np0005466013 NetworkManager[51205]: <info>  [1759407065.0185] manager: (tap1f708987-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.028 2 INFO os_vif [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:cb:d1,bridge_name='br-int',has_traffic_filtering=True,id=1f708987-9c45-4948-b794-28b4c634ea5d,network=Network(6b319229-b7a5-4343-8a9f-30a4189f9c4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f708987-9c')#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.138 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.139 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.139 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] No VIF found with MAC fa:16:3e:61:cb:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.139 2 INFO nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Using config drive#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.544 2 INFO nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Creating config drive at /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk.config#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.551 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgegz7dgv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.685 2 DEBUG oslo_concurrency.processutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgegz7dgv" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:05 np0005466013 NetworkManager[51205]: <info>  [1759407065.7515] manager: (tap1f708987-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Oct  2 08:11:05 np0005466013 kernel: tap1f708987-9c: entered promiscuous mode
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:05Z|00139|binding|INFO|Claiming lport 1f708987-9c45-4948-b794-28b4c634ea5d for this chassis.
Oct  2 08:11:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:05Z|00140|binding|INFO|1f708987-9c45-4948-b794-28b4c634ea5d: Claiming fa:16:3e:61:cb:d1 10.100.0.9
Oct  2 08:11:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:05Z|00141|binding|INFO|Setting lport 1f708987-9c45-4948-b794-28b4c634ea5d ovn-installed in OVS
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466013 nova_compute[192144]: 2025-10-02 12:11:05.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:05Z|00142|binding|INFO|Setting lport 1f708987-9c45-4948-b794-28b4c634ea5d up in Southbound
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.775 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:cb:d1 10.100.0.9'], port_security=['fa:16:3e:61:cb:d1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b721cd83-2c83-42fe-9c6d-205a8d8dd7e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccd24e31-d17f-4734-be86-ecf006ac2832, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=1f708987-9c45-4948-b794-28b4c634ea5d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.776 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 1f708987-9c45-4948-b794-28b4c634ea5d in datapath 6b319229-b7a5-4343-8a9f-30a4189f9c4c bound to our chassis#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.778 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b319229-b7a5-4343-8a9f-30a4189f9c4c#033[00m
Oct  2 08:11:05 np0005466013 systemd-udevd[226982]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.799 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b39a3f75-e1c9-4d42-80c9-e4072e4f6abd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.801 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b319229-b1 in ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.804 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b319229-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.804 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[62766e82-e310-4801-9130-4c9081969f39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.805 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[93dd44f8-0678-45bc-b38d-9fad725dfb6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 NetworkManager[51205]: <info>  [1759407065.8124] device (tap1f708987-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:11:05 np0005466013 NetworkManager[51205]: <info>  [1759407065.8134] device (tap1f708987-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:11:05 np0005466013 systemd-machined[152202]: New machine qemu-22-instance-00000037.
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.821 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[237aade6-3762-4768-a69b-9c17567700f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 systemd[1]: Started Virtual Machine qemu-22-instance-00000037.
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.848 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a9affb29-8175-4b28-bab3-9fd093e26f8b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.880 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[356574ea-fbba-4638-b446-cfea94ecbd8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 NetworkManager[51205]: <info>  [1759407065.8888] manager: (tap6b319229-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.887 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d17e6a6b-9805-4e62-ac0e-ff27f8d34fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.924 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[205488c9-b484-4bdb-876b-93b8daeb0ca2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.929 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d5348004-473e-4eb9-8929-b3a11b6a6ca5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 NetworkManager[51205]: <info>  [1759407065.9535] device (tap6b319229-b0): carrier: link connected
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.960 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0c122d6a-b27c-410b-a25f-c0cdb8bf40f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:05.982 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[75fa6e57-cc77-4e76-bb87-7d4829a886b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b319229-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:3a:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505757, 'reachable_time': 42619, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227017, 'error': None, 'target': 'ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.000 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fc3b33-c9b3-499c-a0e2-ae5cb4307236]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:3a12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505757, 'tstamp': 505757}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227018, 'error': None, 'target': 'ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.022 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[94f5849b-7caf-4f0c-800d-12f6b70c3fa7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b319229-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:3a:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505757, 'reachable_time': 42619, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227019, 'error': None, 'target': 'ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.061 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc51ca1-f8a3-4eeb-83d9-dd07f8a43c97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.136 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e9778b1-007d-4ca2-86a4-9614e7caed24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.150 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b319229-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.150 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.150 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b319229-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:06 np0005466013 nova_compute[192144]: 2025-10-02 12:11:06.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:06 np0005466013 kernel: tap6b319229-b0: entered promiscuous mode
Oct  2 08:11:06 np0005466013 NetworkManager[51205]: <info>  [1759407066.1548] manager: (tap6b319229-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Oct  2 08:11:06 np0005466013 nova_compute[192144]: 2025-10-02 12:11:06.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.159 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b319229-b0, col_values=(('external_ids', {'iface-id': 'bae948b1-4f73-4351-9747-4eb0bc1cf88b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:06 np0005466013 nova_compute[192144]: 2025-10-02 12:11:06.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:06Z|00143|binding|INFO|Releasing lport bae948b1-4f73-4351-9747-4eb0bc1cf88b from this chassis (sb_readonly=0)
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.163 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b319229-b7a5-4343-8a9f-30a4189f9c4c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b319229-b7a5-4343-8a9f-30a4189f9c4c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.166 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e20d99-4ac9-45a6-847b-47651a78755c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.172 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-6b319229-b7a5-4343-8a9f-30a4189f9c4c
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/6b319229-b7a5-4343-8a9f-30a4189f9c4c.pid.haproxy
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 6b319229-b7a5-4343-8a9f-30a4189f9c4c
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:11:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:06.173 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'env', 'PROCESS_TAG=haproxy-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b319229-b7a5-4343-8a9f-30a4189f9c4c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:11:06 np0005466013 nova_compute[192144]: 2025-10-02 12:11:06.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:06Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9c:f9:18 10.100.0.11
Oct  2 08:11:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:06Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9c:f9:18 10.100.0.11
Oct  2 08:11:06 np0005466013 podman[227051]: 2025-10-02 12:11:06.575705268 +0000 UTC m=+0.030787401 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:11:06 np0005466013 nova_compute[192144]: 2025-10-02 12:11:06.885 2 DEBUG nova.network.neutron [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Updated VIF entry in instance network info cache for port 1f708987-9c45-4948-b794-28b4c634ea5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:06 np0005466013 nova_compute[192144]: 2025-10-02 12:11:06.886 2 DEBUG nova.network.neutron [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Updating instance_info_cache with network_info: [{"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:06 np0005466013 nova_compute[192144]: 2025-10-02 12:11:06.908 2 DEBUG oslo_concurrency.lockutils [req-e9a6d113-9f7c-4754-b30a-8a9810f3cea8 req-ee73f087-bc7e-4505-b66c-6eed32e2b030 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:07 np0005466013 podman[227051]: 2025-10-02 12:11:07.095257625 +0000 UTC m=+0.550339738 container create ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:11:07 np0005466013 systemd[1]: Started libpod-conmon-ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1.scope.
Oct  2 08:11:07 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:11:07 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/464acf72dc5ee27fb2a7be97615991fd8433d092bb77166e12525414b9a6aec6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:11:07 np0005466013 podman[227051]: 2025-10-02 12:11:07.266671984 +0000 UTC m=+0.721754127 container init ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:11:07 np0005466013 podman[227051]: 2025-10-02 12:11:07.274644412 +0000 UTC m=+0.729726535 container start ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:11:07 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [NOTICE]   (227077) : New worker (227079) forked
Oct  2 08:11:07 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [NOTICE]   (227077) : Loading success.
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.429 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407067.4288874, f12db2c7-b990-46f2-8865-699df3c176e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.430 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] VM Started (Lifecycle Event)#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.454 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.460 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407067.4290416, f12db2c7-b990-46f2-8865-699df3c176e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.461 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.486 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.491 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.515 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.710 2 DEBUG nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.710 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.711 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.711 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.711 2 DEBUG nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Processing event network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.711 2 DEBUG nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.711 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.712 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.712 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.712 2 DEBUG nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] No waiting events found dispatching network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.712 2 WARNING nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received unexpected event network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.713 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.717 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407067.715977, f12db2c7-b990-46f2-8865-699df3c176e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.717 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.720 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.725 2 INFO nova.virt.libvirt.driver [-] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Instance spawned successfully.#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.726 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.744 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.754 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.758 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.759 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.760 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.760 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.761 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.761 2 DEBUG nova.virt.libvirt.driver [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.782 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.830 2 INFO nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Took 9.06 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.830 2 DEBUG nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:07 np0005466013 nova_compute[192144]: 2025-10-02 12:11:07.932 2 INFO nova.compute.manager [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Took 9.75 seconds to build instance.#033[00m
Oct  2 08:11:08 np0005466013 nova_compute[192144]: 2025-10-02 12:11:08.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466013 nova_compute[192144]: 2025-10-02 12:11:08.147 2 DEBUG oslo_concurrency.lockutils [None req-6a8dccfc-1c08-4714-a169-e301051cfb91 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:10 np0005466013 nova_compute[192144]: 2025-10-02 12:11:10.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:12 np0005466013 nova_compute[192144]: 2025-10-02 12:11:12.108 2 DEBUG nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-changed-1f708987-9c45-4948-b794-28b4c634ea5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:12 np0005466013 nova_compute[192144]: 2025-10-02 12:11:12.110 2 DEBUG nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Refreshing instance network info cache due to event network-changed-1f708987-9c45-4948-b794-28b4c634ea5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:12 np0005466013 nova_compute[192144]: 2025-10-02 12:11:12.110 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:12 np0005466013 nova_compute[192144]: 2025-10-02 12:11:12.111 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:12 np0005466013 nova_compute[192144]: 2025-10-02 12:11:12.111 2 DEBUG nova.network.neutron [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Refreshing network info cache for port 1f708987-9c45-4948-b794-28b4c634ea5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:13 np0005466013 nova_compute[192144]: 2025-10-02 12:11:13.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:13 np0005466013 nova_compute[192144]: 2025-10-02 12:11:13.650 2 DEBUG nova.network.neutron [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Updated VIF entry in instance network info cache for port 1f708987-9c45-4948-b794-28b4c634ea5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:13 np0005466013 nova_compute[192144]: 2025-10-02 12:11:13.651 2 DEBUG nova.network.neutron [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Updating instance_info_cache with network_info: [{"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:13 np0005466013 nova_compute[192144]: 2025-10-02 12:11:13.674 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f12db2c7-b990-46f2-8865-699df3c176e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:13 np0005466013 podman[227088]: 2025-10-02 12:11:13.68338112 +0000 UTC m=+0.060571560 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:11:15 np0005466013 nova_compute[192144]: 2025-10-02 12:11:15.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.079 2 DEBUG oslo_concurrency.lockutils [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-5c564602-5be7-47b0-858a-52eed7fcfd09" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.081 2 DEBUG oslo_concurrency.lockutils [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-5c564602-5be7-47b0-858a-52eed7fcfd09" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.082 2 DEBUG nova.objects.instance [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid e0215fe6-39be-4529-9345-a5fcb4e3e6ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.312 2 DEBUG nova.compute.manager [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.313 2 DEBUG nova.compute.manager [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing instance network info cache due to event network-changed-04cd91f3-a598-48c3-bb6a-789dece3461d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.313 2 DEBUG oslo_concurrency.lockutils [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.313 2 DEBUG oslo_concurrency.lockutils [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.314 2 DEBUG nova.network.neutron [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.346 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'name': 'guest-instance-1.domain.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000037', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7c0445031175477fb35cf45ea4e8ebe9', 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'hostId': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.350 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'name': 'tempest-tempest.common.compute-instance-1321990610', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000035', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ef4e3be787374d90a6a236c7f76bd940', 'user_id': 'fbc7616089cb4f78832692487019c83d', 'hostId': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.350 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.372 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.373 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.400 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.read.requests volume: 1133 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.401 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '852923e1-b6bb-4186-83e9-8ea272c123a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.351059', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e587c22e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '284ae3d0c53c84f495a02f463a532f90defc5e04c67296554b37a9720f6bc922'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.351059', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e587d23c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '5aac3425fd79f673982d44d870f92d324cd98d297c68baedbcfab83691fb7508'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1133, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.351059', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e58bf416-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '3fd42886f217dc43d61d76f09aa9a1eb108a85db2b0ae5aabad435deb6d93b08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.351059', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e58c06fe-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '00b23aa77cb7e6d493f3a64fb1a62910fd981b98a23b43d5eaf8dc5966b6fad2'}]}, 'timestamp': '2025-10-02 12:11:16.401489', '_unique_id': 'b73321fd2f364c57bbf9793f48bdc342'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.406 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.409 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f12db2c7-b990-46f2-8865-699df3c176e6 / tap1f708987-9c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.410 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.413 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e0215fe6-39be-4529-9345-a5fcb4e3e6ee / tap04cd91f3-a5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.413 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6fd4851-a19e-4807-8e14-6cb9167bc67e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.406682', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e58d7cfa-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '3b1b96f5836dd4c99745fe52c598a937aeaa8a3397e3200d82e7001735947f6a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.406682', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e58dfa68-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': '0c7ac57fd7265755b5c7dc8b630388f9cdc5f875fc317703a67472eb2d0d33b2'}]}, 'timestamp': '2025-10-02 12:11:16.414310', '_unique_id': '520d7b34138543b98f9d1aa143566d02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.415 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.419 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.419 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.419 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f95c2cb1-98eb-4d32-b1c2-35d158835074', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.419207', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e58ed0fa-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '7a49c102bf2734551d4b6087a74397c29f170593cc830ddfa3f020be9ebaf30b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.419207', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e58ee2ca-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': 'f5dd4e353320cd3681c7c42972c6ad94b5948b9b4ccdba323248bfc9f5e71441'}]}, 'timestamp': '2025-10-02 12:11:16.420229', '_unique_id': '369a9e8d519740de9614fb1b321e722f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.424 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.425 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.outgoing.bytes volume: 3320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3d6281e-b2d9-4748-a8c1-cf9f2a83b0e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.424786', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e58fab4c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '1fc367922c540bb6f211b46ac1308b668e823950ac43e917f5b1543d97482ca3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3320, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.424786', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e58fbbaa-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': 'af2e38be360200d78eead605731f0e9b5331512f73511528f8c61cfaa53df42a'}]}, 'timestamp': '2025-10-02 12:11:16.425773', '_unique_id': '5eb667b8c5b0469daa07337343186d25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.426 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.443 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.443 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.459 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.460 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44a0287c-65c6-4b8c-b277-95a06aabf317', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.430188', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e592761a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.108534258, 'message_signature': '9f8f51e39e7236b32a448d8cdd57f62ea1f33ad7c5d4066b5de9ba95043e00dd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.430188', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59282ea-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.108534258, 'message_signature': '0febc429ac30bdc34e31a6f975981ac70a620aedc88ac15e92a468b098751f2c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.430188', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 
'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5950060-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.122175846, 'message_signature': '44f770969b4a2f32ba25410ba0a08ded1a2a9de0b403cedd3da5ee3f9375e8a4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.430188', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59510fa-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.122175846, 'message_signature': 'aca8c84584057915bd5a886ec932b9f1e2d31d1e89bfb023ca96a8ee239a815e'}]}, 'timestamp': '2025-10-02 12:11:16.460695', '_unique_id': 'd6366c3c059e4f9ea4f6479d6bbabbd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.461 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.462 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.484 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.485 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f12db2c7-b990-46f2-8865-699df3c176e6: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.512 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/memory.usage volume: 42.7265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fc94fcd-7972-4823-9287-32d2aa3f180d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7265625, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'timestamp': '2025-10-02T12:11:16.467548', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e59d0c10-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.190367612, 'message_signature': 'a0e3e50ceb43b1d920604544d4159758fcd1d0910c82cca2a496603d80c31077'}]}, 'timestamp': '2025-10-02 12:11:16.513137', '_unique_id': '86cf5d315f284e9c857dbb8a3d473c32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.516 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.517 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.517 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.518 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0ea3a74-371e-42a8-8ac5-e86a50a734aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.517587', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e59dcdc6-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '27ae64411e76ae329930e3ab3fc5678af0f688f790fb926e70671d63a93fd523'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.517587', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e59dddfc-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': '9acd1fb0ef959cbeeacc4477cc47a283341d98b1231930fe2bcc561d0a2590ba'}]}, 'timestamp': '2025-10-02 12:11:16.518409', '_unique_id': 'b98991d82e6a4a1e80b4031134f2a383'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.519 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.520 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.520 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.520 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>]
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.520 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.520 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.read.latency volume: 602276631 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.521 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.read.latency volume: 3552085 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.521 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.read.latency volume: 1454804113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.521 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.read.latency volume: 52318218 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6d453eb-d722-4bf9-9511-e133b1ead844', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 602276631, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.520956', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59e4f8a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '877eea6bf62cf2e11c6c13c83d5ce2519f57d42c3ce6a83614f4853a4760bb8a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3552085, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.520956', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59e5c5a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '54ac056a3181b0adb61322b9f1cda674bb38fd6ab76b0462d3b290a9c990fbe5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1454804113, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.520956', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59e6790-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '53f7fa9976a05a6c23dfdfe1dff936c9b51ab2daa4b288f509732d53caf1524a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52318218, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.520956', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59e7424-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '0141d9e060103966df50ea00591fa8bee252b1d98056c157cc46e338b73f5e72'}]}, 'timestamp': '2025-10-02 12:11:16.522238', '_unique_id': 'c83fdcc8f86a441b8c359290cdbe3488'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.523 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.524 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.524 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.525 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.525 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6fdb80e-3b9c-4607-a353-31bcd69debcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.524618', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59ede0a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': 'e73eacb91aa428c6bc088de0994f2d0ce4e331d6919d915a569fd9c61da1ea08'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.524618', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59eebe8-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '65eb94966eb9781ed4af77c405e0eb7ab2253e4a1f637bc42dd55a18b27f9743'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.524618', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59ef76e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '3371cac54e90e6193a0e9670b17e72c0befd452a1160f2bfd053116c03fb20f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.524618', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59f02cc-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '89e3f61c5eafe6825504cdd80942ecd7a5dc395abd9faa77cd88fd3b7369e3aa'}]}, 'timestamp': '2025-10-02 12:11:16.525922', '_unique_id': '669e8e43953f4e8297c9e6ca91f4287c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.526 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.527 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.528 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.528 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.write.latency volume: 5787862110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.528 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b3fc139-e9c8-4ffc-b848-0562bc88e6da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.527946', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59f6050-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': 'f5afe69bedbfda73b8fe036f1bc5022e978760d26ff0358d57c0df474629444f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.527946', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59f6d16-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': 'be7370164182ddb45592aed877977c53af6af5ef55fe0552d71f336ccdd2ac82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5787862110, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.527946', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59f793c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '7e26ff8bf7996b15a2018ebdde884b024a65b5aae73b3a15fb7b5f6790116e5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.527946', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59f86c0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '3d0be94bba1b3986ab13af80fd7c48e6b1908edce0c2afcb1b642e9111c6ffbf'}]}, 'timestamp': '2025-10-02 12:11:16.529270', '_unique_id': 'd138c3b92a504dfca796a51d69df3c24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.530 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.531 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.531 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.531 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.write.bytes volume: 72908800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.532 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e79b5165-0fcd-4573-9c84-1c62a0a576b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.531090', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59fdab2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '36247effa20cec1cbe782f3ff7b18888a473fd914c27f4d36228329e40089338'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.531090', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59fe67e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '3355ff48fed3d2a3ee2e50734d858945d6add4e80bc5c0a7b79c240d3b329361'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72908800, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.531090', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 
'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e59ff25e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': 'dca6e2db01db5667e9568c4abecb79f3491f5a29b694f8c296fab2224bf3e2b0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.531090', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e59ffd12-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '54c9af82588674a255b462b85ae769ec912bfeb923db5f062639a6595e2e88e9'}]}, 'timestamp': '2025-10-02 12:11:16.532313', '_unique_id': 'e8e8b06a5f8749bab8e747b3c0cebebb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.533 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.534 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.534 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.outgoing.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da8b46b9-00e4-450b-addc-8eba03081561', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.534190', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e5a054b0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '2b258f52b4c11f544775f1c9095200baafe37651b54372a5426762a3eb72c23d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.534190', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e5a0619e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': 'fed2b8167226943bbe8b33ec0122c48818d0efafdbb8be95f9d7939608be6aec'}]}, 'timestamp': '2025-10-02 12:11:16.534935', '_unique_id': '551c39bc587d4d7e9e51cf7c9d627930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.535 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.536 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.537 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.incoming.bytes volume: 4199 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b8b7416-3c06-4fea-8514-06ec9afcd11b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.536673', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e5a0b57c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '1fe7958d570142c41714c45e4a7b3aa43acd9b2da34e0b73b514d4a8a8da1e13'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4199, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.536673', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e5a0c274-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': 'f29effe5a2a2426ae0f5ca1d8ed54cc9cf8c48c114331d6a322c31211580da03'}]}, 'timestamp': '2025-10-02 12:11:16.537346', '_unique_id': 'cb4d326760814e0e91997010ef707c1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.538 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.539 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.539 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e6ce719-88ce-4e7a-bc57-6b05a12cd628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.539023', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e5a1106c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': 'c52e57d378666f75a54ad980347c674f0212631997eaa7bbec6c918fd3df02f2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.539023', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e5a11c4c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': 'a1e1478b7599ddde225a9ab9af5a7e57d1ad2469b6c41a6ec8d1787ff4b24712'}]}, 'timestamp': '2025-10-02 12:11:16.539662', '_unique_id': '9231477d5cc947eea7934df5dceb5313'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.540 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.541 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.541 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/cpu volume: 8510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.541 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/cpu volume: 12270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3510817a-31a8-46e8-858f-08716736d794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8510000000, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'timestamp': '2025-10-02T12:11:16.541318', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e5a16a1c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.162816379, 'message_signature': '5746d717ee594cf0727ec6d96910fe702f7e92769cd9aea85f9053fcec63fc8f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12270000000, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 
'timestamp': '2025-10-02T12:11:16.541318', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e5a1762e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.190367612, 'message_signature': '9ddfa53987222a59124d97c06df3c4264ded561c663b54afb33a009f17b88210'}]}, 'timestamp': '2025-10-02 12:11:16.541970', '_unique_id': '3cfaa3235c7e4b6f9f48e68baa883ead'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.542 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.543 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.543 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.543 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.544 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.read.bytes volume: 31009280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.544 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7b88fca-df8a-4737-933a-c66a3f8a9c22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.543581', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5a1c246-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': 'ebb2f99a9309fa14c4878a2b66bb5bf1f1307f50e689fb9722db672f75845a5e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.543581', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5a1cf8e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.029386684, 'message_signature': '7307eaaf341882d1cb5997e3e7d17f24c4669c0e262091c5bfe74e61fc8cbf14'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31009280, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.543581', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5a1db5a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': 'fa3bcce616c25c7a0d3140a46c209ded02ea112d27aa3b7b066caaeaaac571c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.543581', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5a1e6a4-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.052121954, 'message_signature': '12805a7c774e16b24232e9c02ac404f6d625ebd1fff12723b2b9f2b1f8f1499c'}]}, 'timestamp': '2025-10-02 12:11:16.544814', '_unique_id': '5ed6a9b261574727a48a383e02a616c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.545 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.546 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.546 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.546 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.547 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.547 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e66a3b1-3c57-47a3-a504-f4bb4b476f6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.546472', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5a233c0-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.108534258, 'message_signature': '9caef9abef48aa695c6030196baefdd448404373bd20df541d3f62e1bfe37abe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.546472', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5a241f8-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.108534258, 'message_signature': 'e5f77180a04644faa9ad9938aafb102f7c57bb663c02bd39716d85e9da5c43ec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.546472', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 
'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5a24d38-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.122175846, 'message_signature': 'ab8af974064e8b537d6f5ea24990559b48648623f7ddece95c7f18c4d0c2b846'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.546472', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5a2585a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.122175846, 'message_signature': '554c4629a7f2dddffda08f4ace60618b7f572f83187bdff421e4560ba08bf1d3'}]}, 'timestamp': '2025-10-02 12:11:16.547735', '_unique_id': 'bcf191ff69f349528c2b17f46865c779'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.548 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.549 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.549 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.549 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46c2860b-e36a-4083-a4e8-89b7d0c0e123', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.549614', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e5a2ae7c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': 'e581793e827a1051c0b6eab1b9189fece937e2944e6c1ab4d0763058b39bfd48'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.549614', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e5a2bc50-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': '65ea83bd4880fa4d207b2535af97a61167624f122cc7a306c775d61cac5b27b5'}]}, 'timestamp': '2025-10-02 12:11:16.550316', '_unique_id': 'f1d2e59a0db4449292f762595ac12667'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.551 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>]
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.552 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.552 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.552 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.552 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.553 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbc80e25-ad15-45f7-96d1-ccb43645d472', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'f12db2c7-b990-46f2-8865-699df3c176e6-vda', 'timestamp': '2025-10-02T12:11:16.552114', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5a30f8e-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.108534258, 'message_signature': '940270605253b7ddf6e061f03377c85c7042174d9f10760c6304bf198f05b9eb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 
'f12db2c7-b990-46f2-8865-699df3c176e6-sda', 'timestamp': '2025-10-02T12:11:16.552114', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'instance-00000037', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5a31b00-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.108534258, 'message_signature': '41b876de3ff0b75ae4cfa71579b46eb77a4c4f5020fe30b03d00a340536c7bb2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-vda', 'timestamp': '2025-10-02T12:11:16.552114', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 
'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5a326b8-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.122175846, 'message_signature': 'c423e021c93b6085e85f3f68c8aea389447d19847eff8ef6da80e178ecdaf17a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee-sda', 'timestamp': '2025-10-02T12:11:16.552114', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'instance-00000035', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5a331b2-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.122175846, 'message_signature': 'd693c434277e400d6f9f947d5d8bdc23eb5fecc9e1e6262bd7837f67d508c387'}]}, 'timestamp': '2025-10-02 12:11:16.553305', '_unique_id': 'afbe7349d3354480a3f628c18a78e583'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.554 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.555 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.555 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.555 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.incoming.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fcbc9b2-6a36-44c9-bb33-ff55e7da73c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.555088', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e5a38482-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '771715c22f179f55dc314e98779267464921796650729fad06af76de40bb570c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.555088', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e5a3901c-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': 'a582264cdc31bb1c1c9277e76157fc777b1c11f3cc72a9c294d8158d10e4a219'}]}, 'timestamp': '2025-10-02 12:11:16.555709', '_unique_id': '4194e307c3e84c7dab549ef1e774ace1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.556 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.557 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.557 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>]
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.557 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.557 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.557 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1.domain.com>, <NovaLikeServer: tempest-tempest.common.compute-instance-1321990610>]
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.557 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.557 12 DEBUG ceilometer.compute.pollsters [-] f12db2c7-b990-46f2-8865-699df3c176e6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.558 12 DEBUG ceilometer.compute.pollsters [-] e0215fe6-39be-4529-9345-a5fcb4e3e6ee/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16fb60a6-d8e1-4dbb-9042-1186df4e977c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5b11a704bc3240da9c36e22382c9bd70', 'user_name': None, 'project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'project_name': None, 'resource_id': 'instance-00000037-f12db2c7-b990-46f2-8865-699df3c176e6-tap1f708987-9c', 'timestamp': '2025-10-02T12:11:16.557754', 'resource_metadata': {'display_name': 'guest-instance-1.domain.com', 'name': 'tap1f708987-9c', 'instance_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'instance_type': 'm1.nano', 'host': '4ce9181730da9bfdc487101e799e7aadb94c3f84a22dad40d074351b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:61:cb:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f708987-9c'}, 'message_id': 'e5a3ed5a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.085045986, 'message_signature': '35c42b6cf81553c8ce34f78e651b51d6542656a46745f7421fd5e1845f7e1fd4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000035-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-tap04cd91f3-a5', 'timestamp': '2025-10-02T12:11:16.557754', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1321990610', 'name': 'tap04cd91f3-a5', 'instance_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'instance_type': 'm1.nano', 'host': '27faf5241fba9dee681b8eeae06daf4ee3b570dc0375dc29c00251b9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9c:f9:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04cd91f3-a5'}, 'message_id': 'e5a3f89a-9f88-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5068.089393357, 'message_signature': 'fe0a77ecbb8122195c6af0104e4a1dffce132cbc56319f0ebe01a0c474e17e3a'}]}, 'timestamp': '2025-10-02 12:11:16.558577', '_unique_id': 'b86a860e40ca47b3a9359b4b5c68dd6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:11:16.559 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.634 2 DEBUG nova.objects.instance [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_requests' on Instance uuid e0215fe6-39be-4529-9345-a5fcb4e3e6ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.653 2 DEBUG nova.network.neutron [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:11:16 np0005466013 podman[227108]: 2025-10-02 12:11:16.685323404 +0000 UTC m=+0.064320512 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Oct  2 08:11:16 np0005466013 podman[227109]: 2025-10-02 12:11:16.692048804 +0000 UTC m=+0.066046313 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Oct  2 08:11:16 np0005466013 nova_compute[192144]: 2025-10-02 12:11:16.969 2 DEBUG nova.policy [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.409 2 DEBUG nova.network.neutron [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updated VIF entry in instance network info cache for port 04cd91f3-a598-48c3-bb6a-789dece3461d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.410 2 DEBUG nova.network.neutron [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:17.420 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:17.421 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.451 2 DEBUG oslo_concurrency.lockutils [req-925e8172-d769-4292-a670-2d4dd84d9225 req-b8b133d6-887b-4071-bfc1-8c02e01f16a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.690 2 DEBUG nova.network.neutron [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Successfully updated port: 5c564602-5be7-47b0-858a-52eed7fcfd09 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.708 2 DEBUG oslo_concurrency.lockutils [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.709 2 DEBUG oslo_concurrency.lockutils [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.709 2 DEBUG nova.network.neutron [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.788 2 DEBUG nova.compute.manager [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-changed-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.790 2 DEBUG nova.compute.manager [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing instance network info cache due to event network-changed-5c564602-5be7-47b0-858a-52eed7fcfd09. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.791 2 DEBUG oslo_concurrency.lockutils [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.858 2 WARNING nova.network.neutron [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:11:17 np0005466013 nova_compute[192144]: 2025-10-02 12:11:17.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:11:18 np0005466013 nova_compute[192144]: 2025-10-02 12:11:18.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:18 np0005466013 nova_compute[192144]: 2025-10-02 12:11:18.186 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:20 np0005466013 nova_compute[192144]: 2025-10-02 12:11:20.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:20 np0005466013 podman[227165]: 2025-10-02 12:11:20.682031926 +0000 UTC m=+0.059886930 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:11:20 np0005466013 podman[227166]: 2025-10-02 12:11:20.698813057 +0000 UTC m=+0.070908568 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.376 2 DEBUG nova.network.neutron [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.399 2 DEBUG oslo_concurrency.lockutils [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.400 2 DEBUG oslo_concurrency.lockutils [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.400 2 DEBUG nova.network.neutron [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Refreshing network info cache for port 5c564602-5be7-47b0-858a-52eed7fcfd09 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.404 2 DEBUG nova.virt.libvirt.vif [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.405 2 DEBUG nova.network.os_vif_util [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.405 2 DEBUG nova.network.os_vif_util [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.406 2 DEBUG os_vif [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.407 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.407 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.410 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c564602-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.411 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c564602-5b, col_values=(('external_ids', {'iface-id': '5c564602-5be7-47b0-858a-52eed7fcfd09', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:3c:22', 'vm-uuid': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:21 np0005466013 NetworkManager[51205]: <info>  [1759407081.4135] manager: (tap5c564602-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.422 2 INFO os_vif [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b')#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.423 2 DEBUG nova.virt.libvirt.vif [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.423 2 DEBUG nova.network.os_vif_util [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.424 2 DEBUG nova.network.os_vif_util [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.426 2 DEBUG nova.virt.libvirt.guest [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:2d:3c:22"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <target dev="tap5c564602-5b"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:11:21 np0005466013 nova_compute[192144]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:11:21 np0005466013 kernel: tap5c564602-5b: entered promiscuous mode
Oct  2 08:11:21 np0005466013 NetworkManager[51205]: <info>  [1759407081.4400] manager: (tap5c564602-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:21Z|00144|binding|INFO|Claiming lport 5c564602-5be7-47b0-858a-52eed7fcfd09 for this chassis.
Oct  2 08:11:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:21Z|00145|binding|INFO|5c564602-5be7-47b0-858a-52eed7fcfd09: Claiming fa:16:3e:2d:3c:22 10.100.0.7
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:21Z|00146|binding|INFO|Setting lport 5c564602-5be7-47b0-858a-52eed7fcfd09 ovn-installed in OVS
Oct  2 08:11:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:21Z|00147|binding|INFO|Setting lport 5c564602-5be7-47b0-858a-52eed7fcfd09 up in Southbound
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.455 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:3c:22 10.100.0.7'], port_security=['fa:16:3e:2d:3c:22 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=5c564602-5be7-47b0-858a-52eed7fcfd09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.456 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 5c564602-5be7-47b0-858a-52eed7fcfd09 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 bound to our chassis#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.458 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 systemd-udevd[227216]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.478 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[99d70147-b949-4121-a865-e9afce145e0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:21 np0005466013 NetworkManager[51205]: <info>  [1759407081.4941] device (tap5c564602-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:11:21 np0005466013 NetworkManager[51205]: <info>  [1759407081.4950] device (tap5c564602-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.511 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2ed9c8-20d8-4d40-bca8-1371fe34f201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.515 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[981f9f68-e1cb-4839-a9d0-e4dc59c6ab29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.525 2 DEBUG nova.virt.libvirt.driver [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.526 2 DEBUG nova.virt.libvirt.driver [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.526 2 DEBUG nova.virt.libvirt.driver [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:9c:f9:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.526 2 DEBUG nova.virt.libvirt.driver [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:2d:3c:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.544 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f1342126-580f-4844-bd72-410a78ca1b30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.549 2 DEBUG nova.virt.libvirt.guest [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <nova:name>tempest-tempest.common.compute-instance-1321990610</nova:name>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:11:21</nova:creationTime>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:port uuid="04cd91f3-a598-48c3-bb6a-789dece3461d">
Oct  2 08:11:21 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    <nova:port uuid="5c564602-5be7-47b0-858a-52eed7fcfd09">
Oct  2 08:11:21 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:11:21 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:11:21 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:11:21 np0005466013 nova_compute[192144]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.567 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[56c52c26-a1d1-43dc-b1eb-df6ffd469c66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504264, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227223, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.583 2 DEBUG oslo_concurrency.lockutils [None req-3af35bf0-f515-4356-bc09-16469ba8d072 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-5c564602-5be7-47b0-858a-52eed7fcfd09" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.591 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc09e21-a591-4581-a86f-040cb2065875]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504275, 'tstamp': 504275}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227224, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504277, 'tstamp': 504277}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227224, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.594 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 nova_compute[192144]: 2025-10-02 12:11:21.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.598 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.599 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.599 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:21.599 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:21Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:cb:d1 10.100.0.9
Oct  2 08:11:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:21Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:cb:d1 10.100.0.9
Oct  2 08:11:22 np0005466013 nova_compute[192144]: 2025-10-02 12:11:22.087 2 DEBUG nova.compute.manager [req-ba0adce1-ed05-407e-aef2-97d5bb7bf6ef req-49b155f5-775c-45fe-a651-fa6117ea70d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:22 np0005466013 nova_compute[192144]: 2025-10-02 12:11:22.088 2 DEBUG oslo_concurrency.lockutils [req-ba0adce1-ed05-407e-aef2-97d5bb7bf6ef req-49b155f5-775c-45fe-a651-fa6117ea70d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:22 np0005466013 nova_compute[192144]: 2025-10-02 12:11:22.088 2 DEBUG oslo_concurrency.lockutils [req-ba0adce1-ed05-407e-aef2-97d5bb7bf6ef req-49b155f5-775c-45fe-a651-fa6117ea70d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:22 np0005466013 nova_compute[192144]: 2025-10-02 12:11:22.089 2 DEBUG oslo_concurrency.lockutils [req-ba0adce1-ed05-407e-aef2-97d5bb7bf6ef req-49b155f5-775c-45fe-a651-fa6117ea70d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:22 np0005466013 nova_compute[192144]: 2025-10-02 12:11:22.089 2 DEBUG nova.compute.manager [req-ba0adce1-ed05-407e-aef2-97d5bb7bf6ef req-49b155f5-775c-45fe-a651-fa6117ea70d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] No waiting events found dispatching network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:22 np0005466013 nova_compute[192144]: 2025-10-02 12:11:22.089 2 WARNING nova.compute.manager [req-ba0adce1-ed05-407e-aef2-97d5bb7bf6ef req-49b155f5-775c-45fe-a651-fa6117ea70d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received unexpected event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:22Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:3c:22 10.100.0.7
Oct  2 08:11:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:22Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:3c:22 10.100.0.7
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.268 2 DEBUG oslo_concurrency.lockutils [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-5c564602-5be7-47b0-858a-52eed7fcfd09" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.269 2 DEBUG oslo_concurrency.lockutils [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-5c564602-5be7-47b0-858a-52eed7fcfd09" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.299 2 DEBUG nova.objects.instance [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid e0215fe6-39be-4529-9345-a5fcb4e3e6ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.340 2 DEBUG nova.virt.libvirt.vif [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.340 2 DEBUG nova.network.os_vif_util [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.341 2 DEBUG nova.network.os_vif_util [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.344 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.345 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.348 2 DEBUG nova.virt.libvirt.driver [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Attempting to detach device tap5c564602-5b from instance e0215fe6-39be-4529-9345-a5fcb4e3e6ee from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.348 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:2d:3c:22"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <target dev="tap5c564602-5b"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.356 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.359 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface>not found in domain: <domain type='kvm' id='21'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <name>instance-00000035</name>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <uuid>e0215fe6-39be-4529-9345-a5fcb4e3e6ee</uuid>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:name>tempest-tempest.common.compute-instance-1321990610</nova:name>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:11:21</nova:creationTime>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:port uuid="04cd91f3-a598-48c3-bb6a-789dece3461d">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:port uuid="5c564602-5be7-47b0-858a-52eed7fcfd09">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <memory unit='KiB'>131072</memory>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <resource>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <partition>/machine</partition>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </resource>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <sysinfo type='smbios'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='serial'>e0215fe6-39be-4529-9345-a5fcb4e3e6ee</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='uuid'>e0215fe6-39be-4529-9345-a5fcb4e3e6ee</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <boot dev='hd'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <smbios mode='sysinfo'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <vmcoreinfo state='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <feature policy='require' name='x2apic'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <feature policy='require' name='vme'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <clock offset='utc'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <timer name='hpet' present='no'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <on_reboot>restart</on_reboot>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <on_crash>destroy</on_crash>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <disk type='file' device='disk'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk' index='2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <backingStore type='file' index='3'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <format type='raw'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <backingStore/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      </backingStore>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target dev='vda' bus='virtio'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='virtio-disk0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <disk type='file' device='cdrom'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.config' index='1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <backingStore/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target dev='sda' bus='sata'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <readonly/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='sata0-0-0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pcie.0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='1' port='0x10'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='2' port='0x11'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='3' port='0x12'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='4' port='0x13'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='5' port='0x14'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='6' port='0x15'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='7' port='0x16'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='8' port='0x17'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.8'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='9' port='0x18'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.9'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='10' port='0x19'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.10'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='11' port='0x1a'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.11'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='12' port='0x1b'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.12'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='13' port='0x1c'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.13'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='14' port='0x1d'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.14'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='15' port='0x1e'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.15'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='16' port='0x1f'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.16'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='17' port='0x20'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.17'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='18' port='0x21'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.18'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='19' port='0x22'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.19'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='20' port='0x23'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.20'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='21' port='0x24'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.21'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='22' port='0x25'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.22'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='23' port='0x26'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.23'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='24' port='0x27'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.24'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='25' port='0x28'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.25'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-pci-bridge'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.26'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='usb'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='sata' index='0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='ide'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:9c:f9:18'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target dev='tap04cd91f3-a5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='net0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:2d:3c:22'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target dev='tap5c564602-5b'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='net1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <serial type='pty'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source path='/dev/pts/0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/console.log' append='off'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target type='isa-serial' port='0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <model name='isa-serial'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      </target>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source path='/dev/pts/0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/console.log' append='off'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target type='serial' port='0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </console>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <input type='tablet' bus='usb'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='input0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <input type='mouse' bus='ps2'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='input1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <input type='keyboard' bus='ps2'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='input2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <listen type='address' address='::0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <audio id='1' type='none'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='video0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <watchdog model='itco' action='reset'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='watchdog0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </watchdog>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <memballoon model='virtio'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <stats period='10'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='balloon0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <rng model='virtio'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='rng0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <label>system_u:system_r:svirt_t:s0:c37,c440</label>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c37,c440</imagelabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <label>+107:+107</label>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.360 2 INFO nova.virt.libvirt.driver [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully detached device tap5c564602-5b from instance e0215fe6-39be-4529-9345-a5fcb4e3e6ee from the persistent domain config.
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.361 2 DEBUG nova.virt.libvirt.driver [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] (1/8): Attempting to detach device tap5c564602-5b with device alias net1 from instance e0215fe6-39be-4529-9345-a5fcb4e3e6ee from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.361 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:2d:3c:22"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <target dev="tap5c564602-5b"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:11:23 np0005466013 kernel: tap5c564602-5b (unregistering): left promiscuous mode
Oct  2 08:11:23 np0005466013 NetworkManager[51205]: <info>  [1759407083.4657] device (tap5c564602-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:11:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:23Z|00148|binding|INFO|Releasing lport 5c564602-5be7-47b0-858a-52eed7fcfd09 from this chassis (sb_readonly=0)
Oct  2 08:11:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:23Z|00149|binding|INFO|Setting lport 5c564602-5be7-47b0-858a-52eed7fcfd09 down in Southbound
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:11:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:23Z|00150|binding|INFO|Removing iface tap5c564602-5b ovn-installed in OVS
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.483 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:3c:22 10.100.0.7'], port_security=['fa:16:3e:2d:3c:22 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=5c564602-5be7-47b0-858a-52eed7fcfd09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.484 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 5c564602-5be7-47b0-858a-52eed7fcfd09 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.486 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.501 2 DEBUG nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Received event <DeviceRemovedEvent: 1759407083.500927, e0215fe6-39be-4529-9345-a5fcb4e3e6ee => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.502 2 DEBUG nova.virt.libvirt.driver [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Start waiting for the detach event from libvirt for device tap5c564602-5b with device alias net1 for instance e0215fe6-39be-4529-9345-a5fcb4e3e6ee _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.502 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.502 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7567cf49-a6fe-4ee2-9de9-684c75a46575]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.505 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface>not found in domain: <domain type='kvm' id='21'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <name>instance-00000035</name>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <uuid>e0215fe6-39be-4529-9345-a5fcb4e3e6ee</uuid>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:name>tempest-tempest.common.compute-instance-1321990610</nova:name>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:11:21</nova:creationTime>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:port uuid="04cd91f3-a598-48c3-bb6a-789dece3461d">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:port uuid="5c564602-5be7-47b0-858a-52eed7fcfd09">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <memory unit='KiB'>131072</memory>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <resource>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <partition>/machine</partition>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </resource>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <sysinfo type='smbios'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='serial'>e0215fe6-39be-4529-9345-a5fcb4e3e6ee</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='uuid'>e0215fe6-39be-4529-9345-a5fcb4e3e6ee</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <boot dev='hd'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <smbios mode='sysinfo'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <vmcoreinfo state='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <feature policy='require' name='x2apic'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <feature policy='require' name='vme'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <clock offset='utc'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <timer name='hpet' present='no'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <on_reboot>restart</on_reboot>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <on_crash>destroy</on_crash>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <disk type='file' device='disk'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk' index='2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <backingStore type='file' index='3'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <format type='raw'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <backingStore/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      </backingStore>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target dev='vda' bus='virtio'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='virtio-disk0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <disk type='file' device='cdrom'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/disk.config' index='1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <backingStore/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target dev='sda' bus='sata'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <readonly/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='sata0-0-0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pcie.0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='1' port='0x10'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='2' port='0x11'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='3' port='0x12'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='4' port='0x13'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='5' port='0x14'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='6' port='0x15'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='7' port='0x16'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='8' port='0x17'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.8'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='9' port='0x18'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.9'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='10' port='0x19'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.10'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='11' port='0x1a'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.11'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='12' port='0x1b'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.12'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='13' port='0x1c'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.13'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='14' port='0x1d'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.14'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='15' port='0x1e'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.15'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='16' port='0x1f'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.16'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='17' port='0x20'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.17'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='18' port='0x21'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.18'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='19' port='0x22'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.19'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='20' port='0x23'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.20'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='21' port='0x24'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.21'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='22' port='0x25'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.22'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='23' port='0x26'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.23'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='24' port='0x27'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.24'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target chassis='25' port='0x28'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.25'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model name='pcie-pci-bridge'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='pci.26'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='usb'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <controller type='sata' index='0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='ide'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:9c:f9:18'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target dev='tap04cd91f3-a5'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='net0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <serial type='pty'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source path='/dev/pts/0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/console.log' append='off'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target type='isa-serial' port='0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:        <model name='isa-serial'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      </target>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <source path='/dev/pts/0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee/console.log' append='off'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <target type='serial' port='0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </console>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <input type='tablet' bus='usb'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='input0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <input type='mouse' bus='ps2'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='input1'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <input type='keyboard' bus='ps2'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='input2'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <listen type='address' address='::0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <audio id='1' type='none'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='video0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <watchdog model='itco' action='reset'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='watchdog0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </watchdog>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <memballoon model='virtio'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <stats period='10'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='balloon0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <rng model='virtio'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <alias name='rng0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <label>system_u:system_r:svirt_t:s0:c37,c440</label>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c37,c440</imagelabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <label>+107:+107</label>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.505 2 INFO nova.virt.libvirt.driver [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully detached device tap5c564602-5b from instance e0215fe6-39be-4529-9345-a5fcb4e3e6ee from the live domain config.#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.505 2 DEBUG nova.virt.libvirt.vif [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.506 2 DEBUG nova.network.os_vif_util [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.506 2 DEBUG nova.network.os_vif_util [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.506 2 DEBUG os_vif [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c564602-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.516 2 INFO os_vif [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b')#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.518 2 DEBUG nova.virt.libvirt.guest [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:name>tempest-tempest.common.compute-instance-1321990610</nova:name>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:11:23</nova:creationTime>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    <nova:port uuid="04cd91f3-a598-48c3-bb6a-789dece3461d">
Oct  2 08:11:23 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:11:23 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:11:23 np0005466013 nova_compute[192144]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.537 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f974f927-7289-481d-ad9f-b58ba6953d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.541 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[04734c0f-cbd6-4b4f-972b-4cf11beb93b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.572 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ab94396c-b216-4a2f-b16b-9d2bc0d9416b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.592 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[57a009e4-cfc8-469c-9962-998158af3355]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504264, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227234, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.611 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[940d8a6e-0db0-4a0b-ae25-ca634c59d17d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504275, 'tstamp': 504275}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227235, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504277, 'tstamp': 504277}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227235, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.613 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005466013 nova_compute[192144]: 2025-10-02 12:11:23.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.616 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.616 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.617 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:23.617 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.202 2 DEBUG nova.compute.manager [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.202 2 DEBUG oslo_concurrency.lockutils [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.203 2 DEBUG oslo_concurrency.lockutils [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.203 2 DEBUG oslo_concurrency.lockutils [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.203 2 DEBUG nova.compute.manager [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] No waiting events found dispatching network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.204 2 WARNING nova.compute.manager [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received unexpected event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.205 2 DEBUG nova.compute.manager [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-unplugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.205 2 DEBUG oslo_concurrency.lockutils [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.205 2 DEBUG oslo_concurrency.lockutils [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.205 2 DEBUG oslo_concurrency.lockutils [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.205 2 DEBUG nova.compute.manager [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] No waiting events found dispatching network-vif-unplugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.205 2 WARNING nova.compute.manager [req-04575680-1a58-4c04-afaf-114db69112b5 req-31f4c4ae-c0f5-4ea8-9c44-7c1f7a686088 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received unexpected event network-vif-unplugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.478 2 DEBUG nova.network.neutron [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updated VIF entry in instance network info cache for port 5c564602-5be7-47b0-858a-52eed7fcfd09. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.478 2 DEBUG nova.network.neutron [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.501 2 DEBUG oslo_concurrency.lockutils [req-e9c953d7-0e1b-4dfd-9f3f-a1ffde4d6b85 req-85112f9b-4b5b-425b-87b2-f71d0bd84a80 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.502 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.502 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:11:24 np0005466013 nova_compute[192144]: 2025-10-02 12:11:24.502 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid e0215fe6-39be-4529-9345-a5fcb4e3e6ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.087 2 DEBUG oslo_concurrency.lockutils [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.377 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.378 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.378 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.378 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.378 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.389 2 INFO nova.compute.manager [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Terminating instance#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.401 2 DEBUG nova.compute.manager [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:11:25 np0005466013 kernel: tap04cd91f3-a5 (unregistering): left promiscuous mode
Oct  2 08:11:25 np0005466013 NetworkManager[51205]: <info>  [1759407085.4495] device (tap04cd91f3-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:11:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:25Z|00151|binding|INFO|Releasing lport 04cd91f3-a598-48c3-bb6a-789dece3461d from this chassis (sb_readonly=0)
Oct  2 08:11:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:25Z|00152|binding|INFO|Setting lport 04cd91f3-a598-48c3-bb6a-789dece3461d down in Southbound
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:25Z|00153|binding|INFO|Removing iface tap04cd91f3-a5 ovn-installed in OVS
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:25.464 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:f9:18 10.100.0.11'], port_security=['fa:16:3e:9c:f9:18 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e0215fe6-39be-4529-9345-a5fcb4e3e6ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3b3d6cff-45b6-4476-af05-0164bc00fd3b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=04cd91f3-a598-48c3-bb6a-789dece3461d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:25.465 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 04cd91f3-a598-48c3-bb6a-789dece3461d in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:11:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:25.467 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d845a33-56e0-4850-9f27-8a54095796f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:11:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:25.468 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf494e5-ceca-4134-b764-b3ea1e7a27df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:25.468 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 namespace which is not needed anymore#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005466013 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000035.scope: Deactivated successfully.
Oct  2 08:11:25 np0005466013 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000035.scope: Consumed 14.462s CPU time.
Oct  2 08:11:25 np0005466013 systemd-machined[152202]: Machine qemu-21-instance-00000035 terminated.
Oct  2 08:11:25 np0005466013 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226846]: [NOTICE]   (226850) : haproxy version is 2.8.14-c23fe91
Oct  2 08:11:25 np0005466013 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226846]: [NOTICE]   (226850) : path to executable is /usr/sbin/haproxy
Oct  2 08:11:25 np0005466013 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226846]: [WARNING]  (226850) : Exiting Master process...
Oct  2 08:11:25 np0005466013 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226846]: [ALERT]    (226850) : Current worker (226852) exited with code 143 (Terminated)
Oct  2 08:11:25 np0005466013 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226846]: [WARNING]  (226850) : All workers exited. Exiting... (0)
Oct  2 08:11:25 np0005466013 systemd[1]: libpod-878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5.scope: Deactivated successfully.
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.663 2 INFO nova.virt.libvirt.driver [-] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Instance destroyed successfully.#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.664 2 DEBUG nova.objects.instance [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'resources' on Instance uuid e0215fe6-39be-4529-9345-a5fcb4e3e6ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:25 np0005466013 podman[227259]: 2025-10-02 12:11:25.668016863 +0000 UTC m=+0.114795479 container died 878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.679 2 DEBUG nova.virt.libvirt.vif [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.680 2 DEBUG nova.network.os_vif_util [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.680 2 DEBUG nova.network.os_vif_util [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9c:f9:18,bridge_name='br-int',has_traffic_filtering=True,id=04cd91f3-a598-48c3-bb6a-789dece3461d,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04cd91f3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.681 2 DEBUG os_vif [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:f9:18,bridge_name='br-int',has_traffic_filtering=True,id=04cd91f3-a598-48c3-bb6a-789dece3461d,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04cd91f3-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.684 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04cd91f3-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.690 2 INFO os_vif [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:f9:18,bridge_name='br-int',has_traffic_filtering=True,id=04cd91f3-a598-48c3-bb6a-789dece3461d,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04cd91f3-a5')#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.691 2 DEBUG nova.virt.libvirt.vif [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1321990610',display_name='tempest-tempest.common.compute-instance-1321990610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1321990610',id=53,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-3gr6mfrz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=e0215fe6-39be-4529-9345-a5fcb4e3e6ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.691 2 DEBUG nova.network.os_vif_util [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.692 2 DEBUG nova.network.os_vif_util [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.692 2 DEBUG os_vif [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c564602-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.695 2 INFO os_vif [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b')#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.696 2 INFO nova.virt.libvirt.driver [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Deleting instance files /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee_del#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.696 2 INFO nova.virt.libvirt.driver [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Deletion of /var/lib/nova/instances/e0215fe6-39be-4529-9345-a5fcb4e3e6ee_del complete#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.790 2 INFO nova.compute.manager [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.791 2 DEBUG oslo.service.loopingcall [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.791 2 DEBUG nova.compute.manager [-] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:11:25 np0005466013 nova_compute[192144]: 2025-10-02 12:11:25.791 2 DEBUG nova.network.neutron [-] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:11:25 np0005466013 systemd[1]: var-lib-containers-storage-overlay-d0502dc157657651b30b6d4846c0786fa0b36e60dd957145453926a6ee92b92b-merged.mount: Deactivated successfully.
Oct  2 08:11:25 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5-userdata-shm.mount: Deactivated successfully.
Oct  2 08:11:25 np0005466013 podman[227259]: 2025-10-02 12:11:25.849978157 +0000 UTC m=+0.296756763 container cleanup 878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:11:25 np0005466013 systemd[1]: libpod-conmon-878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5.scope: Deactivated successfully.
Oct  2 08:11:26 np0005466013 podman[227304]: 2025-10-02 12:11:26.070926786 +0000 UTC m=+0.195320154 container remove 878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.077 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9082a5a0-7fb4-4dfa-a4f2-e05186f400c0]: (4, ('Thu Oct  2 12:11:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 (878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5)\n878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5\nThu Oct  2 12:11:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 (878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5)\n878dd236cf5096b531a1a30ef9ca45b5175efa96c4d157618fec30a54cc616e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.079 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2aab7552-3164-499f-9210-c498dd51a686]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.080 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:26 np0005466013 kernel: tap7d845a33-50: left promiscuous mode
Oct  2 08:11:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:26Z|00154|binding|INFO|Releasing lport bae948b1-4f73-4351-9747-4eb0bc1cf88b from this chassis (sb_readonly=0)
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.100 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b41d9bda-3261-4fae-902c-33326f7962a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.153 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b718056a-4342-4aab-9348-6c58cddeb8cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.155 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[06c413da-e1c9-4a90-9b35-af81b1a71a59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.175 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[28640fa4-ad83-4322-87e9-07f9037fb514]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504256, 'reachable_time': 25559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227321, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.178 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.179 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[13201b82-ea23-4852-8f4c-7b35fa8b448c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:26 np0005466013 systemd[1]: run-netns-ovnmeta\x2d7d845a33\x2d56e0\x2d4850\x2d9f27\x2d8a54095796f2.mount: Deactivated successfully.
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.327 2 DEBUG nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.327 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.328 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.328 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.328 2 DEBUG nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] No waiting events found dispatching network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.329 2 WARNING nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received unexpected event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.329 2 DEBUG nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-unplugged-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.329 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.330 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.330 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.330 2 DEBUG nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] No waiting events found dispatching network-vif-unplugged-04cd91f3-a598-48c3-bb6a-789dece3461d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.330 2 DEBUG nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-unplugged-04cd91f3-a598-48c3-bb6a-789dece3461d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.331 2 DEBUG nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.331 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.331 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.331 2 DEBUG oslo_concurrency.lockutils [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.332 2 DEBUG nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] No waiting events found dispatching network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:26 np0005466013 nova_compute[192144]: 2025-10-02 12:11:26.332 2 WARNING nova.compute.manager [req-8e100427-1a06-46cb-ac06-d3a2fdab7ddc req-8f8f8072-534b-4a7e-b060-86f45b602be7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received unexpected event network-vif-plugged-04cd91f3-a598-48c3-bb6a-789dece3461d for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:11:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:26.424 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.516 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.533 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.533 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.534 2 DEBUG oslo_concurrency.lockutils [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.534 2 DEBUG nova.network.neutron [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.535 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.536 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.536 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.536 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.537 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.537 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.537 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.537 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.563 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.563 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.563 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.564 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.635 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.697 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.698 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.756 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.935 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.936 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5578MB free_disk=73.36772537231445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.937 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:27 np0005466013 nova_compute[192144]: 2025-10-02 12:11:27.937 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.111 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance e0215fe6-39be-4529-9345-a5fcb4e3e6ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.111 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f12db2c7-b990-46f2-8865-699df3c176e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.111 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.112 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.168 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.185 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.214 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.214 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.920 2 DEBUG nova.network.neutron [-] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:28 np0005466013 nova_compute[192144]: 2025-10-02 12:11:28.939 2 INFO nova.compute.manager [-] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Took 3.15 seconds to deallocate network for instance.#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.028 2 DEBUG nova.compute.manager [req-c1384b55-6676-4581-b2e2-e968dc663d3c req-4c2fc32d-ca6e-477f-842c-e4ec97411ce8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Received event network-vif-deleted-04cd91f3-a598-48c3-bb6a-789dece3461d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.031 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.032 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.109 2 DEBUG nova.compute.provider_tree [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.125 2 DEBUG nova.scheduler.client.report [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.145 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.198 2 INFO nova.scheduler.client.report [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Deleted allocations for instance e0215fe6-39be-4529-9345-a5fcb4e3e6ee#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.446 2 DEBUG oslo_concurrency.lockutils [None req-3cbf3448-11ba-4584-aed0-34e6911f4695 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "e0215fe6-39be-4529-9345-a5fcb4e3e6ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.749 2 INFO nova.network.neutron [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Port 5c564602-5be7-47b0-858a-52eed7fcfd09 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.749 2 DEBUG nova.network.neutron [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Updating instance_info_cache with network_info: [{"id": "04cd91f3-a598-48c3-bb6a-789dece3461d", "address": "fa:16:3e:9c:f9:18", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04cd91f3-a5", "ovs_interfaceid": "04cd91f3-a598-48c3-bb6a-789dece3461d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.775 2 DEBUG oslo_concurrency.lockutils [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-e0215fe6-39be-4529-9345-a5fcb4e3e6ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:29 np0005466013 nova_compute[192144]: 2025-10-02 12:11:29.806 2 DEBUG oslo_concurrency.lockutils [None req-656bd6d3-82c9-47ff-9c36-b436070844d6 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-e0215fe6-39be-4529-9345-a5fcb4e3e6ee-5c564602-5be7-47b0-858a-52eed7fcfd09" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 6.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:30 np0005466013 nova_compute[192144]: 2025-10-02 12:11:30.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:33 np0005466013 nova_compute[192144]: 2025-10-02 12:11:33.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:33 np0005466013 podman[227330]: 2025-10-02 12:11:33.707331648 +0000 UTC m=+0.081572167 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:11:33 np0005466013 podman[227331]: 2025-10-02 12:11:33.707772371 +0000 UTC m=+0.078026101 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:11:33 np0005466013 podman[227332]: 2025-10-02 12:11:33.742938432 +0000 UTC m=+0.108373688 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:11:34 np0005466013 nova_compute[192144]: 2025-10-02 12:11:34.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:34 np0005466013 nova_compute[192144]: 2025-10-02 12:11:34.209 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:35 np0005466013 nova_compute[192144]: 2025-10-02 12:11:35.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:37Z|00155|binding|INFO|Releasing lport bae948b1-4f73-4351-9747-4eb0bc1cf88b from this chassis (sb_readonly=0)
Oct  2 08:11:37 np0005466013 nova_compute[192144]: 2025-10-02 12:11:37.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:37 np0005466013 nova_compute[192144]: 2025-10-02 12:11:37.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:38 np0005466013 nova_compute[192144]: 2025-10-02 12:11:38.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:40Z|00156|binding|INFO|Releasing lport bae948b1-4f73-4351-9747-4eb0bc1cf88b from this chassis (sb_readonly=0)
Oct  2 08:11:40 np0005466013 nova_compute[192144]: 2025-10-02 12:11:40.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:40 np0005466013 nova_compute[192144]: 2025-10-02 12:11:40.658 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407085.6571615, e0215fe6-39be-4529-9345-a5fcb4e3e6ee => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:40 np0005466013 nova_compute[192144]: 2025-10-02 12:11:40.659 2 INFO nova.compute.manager [-] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:11:40 np0005466013 nova_compute[192144]: 2025-10-02 12:11:40.690 2 DEBUG nova.compute.manager [None req-f0ab5764-faef-4eae-b147-a44facb61e54 - - - - - -] [instance: e0215fe6-39be-4529-9345-a5fcb4e3e6ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:40 np0005466013 nova_compute[192144]: 2025-10-02 12:11:40.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.069 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.070 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.070 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.070 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.070 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.082 2 INFO nova.compute.manager [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Terminating instance#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.093 2 DEBUG nova.compute.manager [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:11:43 np0005466013 kernel: tap1f708987-9c (unregistering): left promiscuous mode
Oct  2 08:11:43 np0005466013 NetworkManager[51205]: <info>  [1759407103.1197] device (tap1f708987-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:11:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:43Z|00157|binding|INFO|Releasing lport 1f708987-9c45-4948-b794-28b4c634ea5d from this chassis (sb_readonly=0)
Oct  2 08:11:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:43Z|00158|binding|INFO|Setting lport 1f708987-9c45-4948-b794-28b4c634ea5d down in Southbound
Oct  2 08:11:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:11:43Z|00159|binding|INFO|Removing iface tap1f708987-9c ovn-installed in OVS
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.144 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:cb:d1 10.100.0.9'], port_security=['fa:16:3e:61:cb:d1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f12db2c7-b990-46f2-8865-699df3c176e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7c0445031175477fb35cf45ea4e8ebe9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b721cd83-2c83-42fe-9c6d-205a8d8dd7e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccd24e31-d17f-4734-be86-ecf006ac2832, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=1f708987-9c45-4948-b794-28b4c634ea5d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.145 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 1f708987-9c45-4948-b794-28b4c634ea5d in datapath 6b319229-b7a5-4343-8a9f-30a4189f9c4c unbound from our chassis#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.147 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b319229-b7a5-4343-8a9f-30a4189f9c4c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.148 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[23128bce-42e2-4118-ab7f-83c706961630]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.148 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c namespace which is not needed anymore#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Deactivated successfully.
Oct  2 08:11:43 np0005466013 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Consumed 15.293s CPU time.
Oct  2 08:11:43 np0005466013 systemd-machined[152202]: Machine qemu-22-instance-00000037 terminated.
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [NOTICE]   (227077) : haproxy version is 2.8.14-c23fe91
Oct  2 08:11:43 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [NOTICE]   (227077) : path to executable is /usr/sbin/haproxy
Oct  2 08:11:43 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [WARNING]  (227077) : Exiting Master process...
Oct  2 08:11:43 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [WARNING]  (227077) : Exiting Master process...
Oct  2 08:11:43 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [ALERT]    (227077) : Current worker (227079) exited with code 143 (Terminated)
Oct  2 08:11:43 np0005466013 neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c[227073]: [WARNING]  (227077) : All workers exited. Exiting... (0)
Oct  2 08:11:43 np0005466013 systemd[1]: libpod-ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1.scope: Deactivated successfully.
Oct  2 08:11:43 np0005466013 podman[227421]: 2025-10-02 12:11:43.297193356 +0000 UTC m=+0.050641396 container died ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:11:43 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1-userdata-shm.mount: Deactivated successfully.
Oct  2 08:11:43 np0005466013 systemd[1]: var-lib-containers-storage-overlay-464acf72dc5ee27fb2a7be97615991fd8433d092bb77166e12525414b9a6aec6-merged.mount: Deactivated successfully.
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.357 2 INFO nova.virt.libvirt.driver [-] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Instance destroyed successfully.#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.359 2 DEBUG nova.objects.instance [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lazy-loading 'resources' on Instance uuid f12db2c7-b990-46f2-8865-699df3c176e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.378 2 DEBUG nova.virt.libvirt.vif [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=55,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlSyfGZ4DLGyW6nLdJOEus31UhYB05UFJ8yocUsxwa8dND8GhDYJhmm+n/B+Hn9fiVn10MXIKsKGB9vg58iJyGT4TqSHNBf4PV0LNj44WE3+z/u3L3HZLHJoFJG3oiLzQ==',key_name='tempest-keypair-174484104',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:11:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7c0445031175477fb35cf45ea4e8ebe9',ramdisk_id='',reservation_id='r-eqv1ywk4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-1450369239',owner_user_name='tempest-ServersTestFqdnHostnames-1450369239-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:11:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5b11a704bc3240da9c36e22382c9bd70',uuid=f12db2c7-b990-46f2-8865-699df3c176e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.378 2 DEBUG nova.network.os_vif_util [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Converting VIF {"id": "1f708987-9c45-4948-b794-28b4c634ea5d", "address": "fa:16:3e:61:cb:d1", "network": {"id": "6b319229-b7a5-4343-8a9f-30a4189f9c4c", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1040296601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7c0445031175477fb35cf45ea4e8ebe9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f708987-9c", "ovs_interfaceid": "1f708987-9c45-4948-b794-28b4c634ea5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.379 2 DEBUG nova.network.os_vif_util [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:61:cb:d1,bridge_name='br-int',has_traffic_filtering=True,id=1f708987-9c45-4948-b794-28b4c634ea5d,network=Network(6b319229-b7a5-4343-8a9f-30a4189f9c4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f708987-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.379 2 DEBUG os_vif [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:cb:d1,bridge_name='br-int',has_traffic_filtering=True,id=1f708987-9c45-4948-b794-28b4c634ea5d,network=Network(6b319229-b7a5-4343-8a9f-30a4189f9c4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f708987-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.381 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f708987-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.387 2 INFO os_vif [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:cb:d1,bridge_name='br-int',has_traffic_filtering=True,id=1f708987-9c45-4948-b794-28b4c634ea5d,network=Network(6b319229-b7a5-4343-8a9f-30a4189f9c4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f708987-9c')#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.387 2 INFO nova.virt.libvirt.driver [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Deleting instance files /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6_del#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.388 2 INFO nova.virt.libvirt.driver [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Deletion of /var/lib/nova/instances/f12db2c7-b990-46f2-8865-699df3c176e6_del complete#033[00m
Oct  2 08:11:43 np0005466013 podman[227421]: 2025-10-02 12:11:43.408979054 +0000 UTC m=+0.162427054 container cleanup ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:11:43 np0005466013 systemd[1]: libpod-conmon-ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1.scope: Deactivated successfully.
Oct  2 08:11:43 np0005466013 podman[227466]: 2025-10-02 12:11:43.476832762 +0000 UTC m=+0.045042084 container remove ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.477 2 INFO nova.compute.manager [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.478 2 DEBUG oslo.service.loopingcall [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.478 2 DEBUG nova.compute.manager [-] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.478 2 DEBUG nova.network.neutron [-] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.483 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec4c7c5-a82c-4513-bae3-712bd7958f05]: (4, ('Thu Oct  2 12:11:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c (ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1)\ned079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1\nThu Oct  2 12:11:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c (ed079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1)\ned079d378436b3a13b328d96ebb851d04a3ae0b7c2b9fb5b71aa279f2bbf15a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.485 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2a5fb0-92cf-4f1a-9ca8-60257efab7e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.486 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b319229-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 kernel: tap6b319229-b0: left promiscuous mode
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.502 2 DEBUG nova.compute.manager [req-48f85ba8-2691-4cfc-92bc-83913b092494 req-19ece53c-3a23-46f2-8145-974232902a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-vif-unplugged-1f708987-9c45-4948-b794-28b4c634ea5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.503 2 DEBUG oslo_concurrency.lockutils [req-48f85ba8-2691-4cfc-92bc-83913b092494 req-19ece53c-3a23-46f2-8145-974232902a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.503 2 DEBUG oslo_concurrency.lockutils [req-48f85ba8-2691-4cfc-92bc-83913b092494 req-19ece53c-3a23-46f2-8145-974232902a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.503 2 DEBUG oslo_concurrency.lockutils [req-48f85ba8-2691-4cfc-92bc-83913b092494 req-19ece53c-3a23-46f2-8145-974232902a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.503 2 DEBUG nova.compute.manager [req-48f85ba8-2691-4cfc-92bc-83913b092494 req-19ece53c-3a23-46f2-8145-974232902a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] No waiting events found dispatching network-vif-unplugged-1f708987-9c45-4948-b794-28b4c634ea5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.504 2 DEBUG nova.compute.manager [req-48f85ba8-2691-4cfc-92bc-83913b092494 req-19ece53c-3a23-46f2-8145-974232902a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-vif-unplugged-1f708987-9c45-4948-b794-28b4c634ea5d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:11:43 np0005466013 nova_compute[192144]: 2025-10-02 12:11:43.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.504 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63a8e32c-f94e-47d3-b72c-adddc48c633f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.528 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ade0ec35-f833-461c-a1a1-ebefdc72a373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.530 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63946c09-f81b-4043-aaf9-ba0966c8ad86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.547 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d2884e14-85e3-49b8-be44-63b74971fef0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505749, 'reachable_time': 29535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227481, 'error': None, 'target': 'ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.549 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b319229-b7a5-4343-8a9f-30a4189f9c4c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:11:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:11:43.549 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[4d343e5d-02d3-4000-bdb3-6d45221c1f0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:43 np0005466013 systemd[1]: run-netns-ovnmeta\x2d6b319229\x2db7a5\x2d4343\x2d8a9f\x2d30a4189f9c4c.mount: Deactivated successfully.
Oct  2 08:11:44 np0005466013 podman[227482]: 2025-10-02 12:11:44.709397354 +0000 UTC m=+0.084494274 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 08:11:44 np0005466013 nova_compute[192144]: 2025-10-02 12:11:44.854 2 DEBUG nova.network.neutron [-] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:44 np0005466013 nova_compute[192144]: 2025-10-02 12:11:44.873 2 INFO nova.compute.manager [-] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Took 1.39 seconds to deallocate network for instance.#033[00m
Oct  2 08:11:44 np0005466013 nova_compute[192144]: 2025-10-02 12:11:44.971 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:44 np0005466013 nova_compute[192144]: 2025-10-02 12:11:44.972 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.050 2 DEBUG nova.compute.provider_tree [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.067 2 DEBUG nova.scheduler.client.report [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.091 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.132 2 INFO nova.scheduler.client.report [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Deleted allocations for instance f12db2c7-b990-46f2-8865-699df3c176e6#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.215 2 DEBUG oslo_concurrency.lockutils [None req-6800d384-6d00-4178-bfbd-46bb49a85341 5b11a704bc3240da9c36e22382c9bd70 7c0445031175477fb35cf45ea4e8ebe9 - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.653 2 DEBUG nova.compute.manager [req-d67d2f82-3d06-4d6f-9b77-86b8ea5f36b9 req-d0872ba9-a83a-4796-b06b-d152248b0675 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.653 2 DEBUG oslo_concurrency.lockutils [req-d67d2f82-3d06-4d6f-9b77-86b8ea5f36b9 req-d0872ba9-a83a-4796-b06b-d152248b0675 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.654 2 DEBUG oslo_concurrency.lockutils [req-d67d2f82-3d06-4d6f-9b77-86b8ea5f36b9 req-d0872ba9-a83a-4796-b06b-d152248b0675 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.654 2 DEBUG oslo_concurrency.lockutils [req-d67d2f82-3d06-4d6f-9b77-86b8ea5f36b9 req-d0872ba9-a83a-4796-b06b-d152248b0675 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f12db2c7-b990-46f2-8865-699df3c176e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.654 2 DEBUG nova.compute.manager [req-d67d2f82-3d06-4d6f-9b77-86b8ea5f36b9 req-d0872ba9-a83a-4796-b06b-d152248b0675 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] No waiting events found dispatching network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.654 2 WARNING nova.compute.manager [req-d67d2f82-3d06-4d6f-9b77-86b8ea5f36b9 req-d0872ba9-a83a-4796-b06b-d152248b0675 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received unexpected event network-vif-plugged-1f708987-9c45-4948-b794-28b4c634ea5d for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:11:45 np0005466013 nova_compute[192144]: 2025-10-02 12:11:45.654 2 DEBUG nova.compute.manager [req-d67d2f82-3d06-4d6f-9b77-86b8ea5f36b9 req-d0872ba9-a83a-4796-b06b-d152248b0675 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Received event network-vif-deleted-1f708987-9c45-4948-b794-28b4c634ea5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:46 np0005466013 nova_compute[192144]: 2025-10-02 12:11:46.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:46 np0005466013 nova_compute[192144]: 2025-10-02 12:11:46.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:47 np0005466013 podman[227503]: 2025-10-02 12:11:47.686485441 +0000 UTC m=+0.061620207 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Oct  2 08:11:47 np0005466013 podman[227502]: 2025-10-02 12:11:47.705646514 +0000 UTC m=+0.085709273 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:11:48 np0005466013 nova_compute[192144]: 2025-10-02 12:11:48.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:48 np0005466013 nova_compute[192144]: 2025-10-02 12:11:48.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:49 np0005466013 nova_compute[192144]: 2025-10-02 12:11:49.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:50 np0005466013 nova_compute[192144]: 2025-10-02 12:11:50.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:51 np0005466013 podman[227544]: 2025-10-02 12:11:51.677186535 +0000 UTC m=+0.052438233 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:11:51 np0005466013 podman[227543]: 2025-10-02 12:11:51.702786847 +0000 UTC m=+0.081784712 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.365 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.366 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.385 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.485 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.486 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.494 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.494 2 INFO nova.compute.claims [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.638 2 DEBUG nova.compute.provider_tree [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.653 2 DEBUG nova.scheduler.client.report [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
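The inventory dump in the line above is how Placement models this host's capacity. The usable amount per resource class follows Placement's standard formula, `(total - reserved) * allocation_ratio`; a minimal sketch computing it from the values in this log (the dict literal is copied from the line above, abridged to the three fields the formula uses):

```python
# Inventory as reported by nova.scheduler.client.report in the log line above.
inventory = {
    'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
}

def effective_capacity(inv):
    """Placement's usable capacity: (total - reserved) * allocation_ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

print(effective_capacity(inventory))
# VCPU (8-0)*4.0 = 32.0, MEMORY_MB (7679-512)*1.0 = 7167.0, DISK_GB (79-1)*0.9 = 70.2
```

This is why the 8-vCPU node can accept the claim at 12:11:52 even with other instances running: with `allocation_ratio=4.0` the scheduler treats it as 32 schedulable vCPUs.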
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.676 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.677 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.759 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.760 2 DEBUG nova.network.neutron [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.781 2 INFO nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.802 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.910 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.912 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.913 2 INFO nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Creating image(s)#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.913 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "/var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.913 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "/var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.914 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "/var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.927 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.985 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.986 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.987 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:52 np0005466013 nova_compute[192144]: 2025-10-02 12:11:52.997 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.051 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.052 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.098 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
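The `qemu-img create` CMD above is nova's Qcow2 image backend making a copy-on-write overlay: the instance disk is a qcow2 file whose backing file is the shared cached base image under `_base/`, sized to the flavor's 1 GiB root disk. A sketch reconstructing that argv (paths and size are taken from the log; `build_create_cmd` is an illustrative helper, not a nova API):

```python
# Values from the log lines above.
base = '/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'
disk = '/var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk'
size = 1073741824  # 1 GiB, the flavor root disk size

def build_create_cmd(backing_file, target, size_bytes):
    """Build the qemu-img argv for a qcow2 overlay on a raw backing image."""
    return ['qemu-img', 'create', '-f', 'qcow2',
            '-o', f'backing_file={backing_file},backing_fmt=raw',
            target, str(size_bytes)]

cmd = build_create_cmd(base, disk, size)
print(' '.join(cmd))
```

The overlay is why the subsequent `can_resize_image` check at 12:11:53 compares sizes: qcow2 overlays can only grow over their backing image, never shrink, hence the "Cannot resize image ... to a smaller size" debug line.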
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.099 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.099 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.157 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.158 2 DEBUG nova.virt.disk.api [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Checking if we can resize image /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.158 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.215 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.215 2 DEBUG nova.virt.disk.api [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Cannot resize image /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.216 2 DEBUG nova.objects.instance [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lazy-loading 'migration_context' on Instance uuid acad95ab-c692-4eb2-b6e0-517da29f69ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.233 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.234 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Ensure instance console log exists: /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.234 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.235 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.235 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:53 np0005466013 nova_compute[192144]: 2025-10-02 12:11:53.887 2 DEBUG nova.policy [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '576e822ba888408e92dc7462577fbdb9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '098cc383ace84803b8f15713d2c201a1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:11:55 np0005466013 nova_compute[192144]: 2025-10-02 12:11:55.346 2 DEBUG nova.network.neutron [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Successfully created port: c6392e8e-b868-4504-8dc9-392ef67e317b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.355 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407103.3552601, f12db2c7-b990-46f2-8865-699df3c176e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.356 2 INFO nova.compute.manager [-] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.375 2 DEBUG nova.compute.manager [None req-d4e581e1-1d58-4bd5-b52a-108079745bfb - - - - - -] [instance: f12db2c7-b990-46f2-8865-699df3c176e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.440 2 DEBUG nova.network.neutron [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Successfully updated port: c6392e8e-b868-4504-8dc9-392ef67e317b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.457 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.458 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquired lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.458 2 DEBUG nova.network.neutron [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.534 2 DEBUG nova.compute.manager [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-changed-c6392e8e-b868-4504-8dc9-392ef67e317b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.535 2 DEBUG nova.compute.manager [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Refreshing instance network info cache due to event network-changed-c6392e8e-b868-4504-8dc9-392ef67e317b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.535 2 DEBUG oslo_concurrency.lockutils [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:58 np0005466013 nova_compute[192144]: 2025-10-02 12:11:58.580 2 DEBUG nova.network.neutron [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.404 2 DEBUG nova.network.neutron [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Updating instance_info_cache with network_info: [{"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.466 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Releasing lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.467 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Instance network_info: |[{"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.467 2 DEBUG oslo_concurrency.lockutils [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.468 2 DEBUG nova.network.neutron [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Refreshing network info cache for port c6392e8e-b868-4504-8dc9-392ef67e317b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.471 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Start _get_guest_xml network_info=[{"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.476 2 WARNING nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.481 2 DEBUG nova.virt.libvirt.host [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.482 2 DEBUG nova.virt.libvirt.host [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.488 2 DEBUG nova.virt.libvirt.host [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.489 2 DEBUG nova.virt.libvirt.host [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.490 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.490 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.491 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.491 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.491 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.491 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.492 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.492 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.492 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.493 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.493 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.493 2 DEBUG nova.virt.hardware [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.496 2 DEBUG nova.virt.libvirt.vif [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-581821638',display_name='tempest-ServersTestManualDisk-server-581821638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-581821638',id=58,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGso+miI8qCYT4p0eCAv5njDOKrweNd8akhDENvee/f5qXp2M6cmnrI7KRJufnciA1FtuchHAFBjX1yiV0u+1PF0AkEv7TVePk+hbK9x3irpqtSU57UjRRd76dMT+VcS4g==',key_name='tempest-keypair-12061868',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='098cc383ace84803b8f15713d2c201a1',ramdisk_id='',reservation_id='r-amb05o0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1767575425',owner_user_name='tempest-ServersTestManualDisk-1767575425-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='576e822ba888408e92dc7462577fbdb9',uuid=acad95ab-c692-4eb2-b6e0-517da29f69ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.497 2 DEBUG nova.network.os_vif_util [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Converting VIF {"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.497 2 DEBUG nova.network.os_vif_util [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:48:b3,bridge_name='br-int',has_traffic_filtering=True,id=c6392e8e-b868-4504-8dc9-392ef67e317b,network=Network(875b01f6-4815-49fe-b2f9-4ac15eeba242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6392e8e-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.498 2 DEBUG nova.objects.instance [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid acad95ab-c692-4eb2-b6e0-517da29f69ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.512 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <uuid>acad95ab-c692-4eb2-b6e0-517da29f69ef</uuid>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <name>instance-0000003a</name>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServersTestManualDisk-server-581821638</nova:name>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:11:59</nova:creationTime>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:user uuid="576e822ba888408e92dc7462577fbdb9">tempest-ServersTestManualDisk-1767575425-project-member</nova:user>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:project uuid="098cc383ace84803b8f15713d2c201a1">tempest-ServersTestManualDisk-1767575425</nova:project>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        <nova:port uuid="c6392e8e-b868-4504-8dc9-392ef67e317b">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <entry name="serial">acad95ab-c692-4eb2-b6e0-517da29f69ef</entry>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <entry name="uuid">acad95ab-c692-4eb2-b6e0-517da29f69ef</entry>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk.config"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:dc:48:b3"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <target dev="tapc6392e8e-b8"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/console.log" append="off"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:11:59 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:11:59 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:11:59 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:11:59 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.515 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Preparing to wait for external event network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.515 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.515 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.516 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.516 2 DEBUG nova.virt.libvirt.vif [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-581821638',display_name='tempest-ServersTestManualDisk-server-581821638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-581821638',id=58,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGso+miI8qCYT4p0eCAv5njDOKrweNd8akhDENvee/f5qXp2M6cmnrI7KRJufnciA1FtuchHAFBjX1yiV0u+1PF0AkEv7TVePk+hbK9x3irpqtSU57UjRRd76dMT+VcS4g==',key_name='tempest-keypair-12061868',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='098cc383ace84803b8f15713d2c201a1',ramdisk_id='',reservation_id='r-amb05o0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1767575425',owner_user_name='tempest-ServersTestManualDisk-1767575425-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='576e822ba888408e92dc7462577fbdb9',uuid=acad95ab-c692-4eb2-b6e0-517da29f69ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.517 2 DEBUG nova.network.os_vif_util [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Converting VIF {"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.517 2 DEBUG nova.network.os_vif_util [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:48:b3,bridge_name='br-int',has_traffic_filtering=True,id=c6392e8e-b868-4504-8dc9-392ef67e317b,network=Network(875b01f6-4815-49fe-b2f9-4ac15eeba242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6392e8e-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.518 2 DEBUG os_vif [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:48:b3,bridge_name='br-int',has_traffic_filtering=True,id=c6392e8e-b868-4504-8dc9-392ef67e317b,network=Network(875b01f6-4815-49fe-b2f9-4ac15eeba242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6392e8e-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.520 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6392e8e-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6392e8e-b8, col_values=(('external_ids', {'iface-id': 'c6392e8e-b868-4504-8dc9-392ef67e317b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:48:b3', 'vm-uuid': 'acad95ab-c692-4eb2-b6e0-517da29f69ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:59 np0005466013 NetworkManager[51205]: <info>  [1759407119.5286] manager: (tapc6392e8e-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.535 2 INFO os_vif [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:48:b3,bridge_name='br-int',has_traffic_filtering=True,id=c6392e8e-b868-4504-8dc9-392ef67e317b,network=Network(875b01f6-4815-49fe-b2f9-4ac15eeba242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6392e8e-b8')#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.590 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.590 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.590 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] No VIF found with MAC fa:16:3e:dc:48:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.591 2 INFO nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Using config drive#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.867 2 INFO nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Creating config drive at /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk.config#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.873 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5n5q60fo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:59 np0005466013 nova_compute[192144]: 2025-10-02 12:11:59.998 2 DEBUG oslo_concurrency.processutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5n5q60fo" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:00 np0005466013 kernel: tapc6392e8e-b8: entered promiscuous mode
Oct  2 08:12:00 np0005466013 NetworkManager[51205]: <info>  [1759407120.0627] manager: (tapc6392e8e-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Oct  2 08:12:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:00Z|00160|binding|INFO|Claiming lport c6392e8e-b868-4504-8dc9-392ef67e317b for this chassis.
Oct  2 08:12:00 np0005466013 nova_compute[192144]: 2025-10-02 12:12:00.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:00Z|00161|binding|INFO|c6392e8e-b868-4504-8dc9-392ef67e317b: Claiming fa:16:3e:dc:48:b3 10.100.0.7
Oct  2 08:12:00 np0005466013 nova_compute[192144]: 2025-10-02 12:12:00.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.079 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:48:b3 10.100.0.7'], port_security=['fa:16:3e:dc:48:b3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'acad95ab-c692-4eb2-b6e0-517da29f69ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '098cc383ace84803b8f15713d2c201a1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be57de36-f32e-444f-ba2c-aeaf28293636', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dab12d7-53a9-429d-b445-cfccadfc19be, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c6392e8e-b868-4504-8dc9-392ef67e317b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.080 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c6392e8e-b868-4504-8dc9-392ef67e317b in datapath 875b01f6-4815-49fe-b2f9-4ac15eeba242 bound to our chassis#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.082 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 875b01f6-4815-49fe-b2f9-4ac15eeba242#033[00m
Oct  2 08:12:00 np0005466013 systemd-udevd[227622]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.094 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[892e786f-2e29-4c77-8064-76e39a37210c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.095 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap875b01f6-41 in ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.098 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap875b01f6-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.098 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f65b92-a939-4f46-ab4a-37a85115c99f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.100 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[db2664fe-27c9-483b-b9bf-896eb5a7927c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 systemd-machined[152202]: New machine qemu-23-instance-0000003a.
Oct  2 08:12:00 np0005466013 NetworkManager[51205]: <info>  [1759407120.1099] device (tapc6392e8e-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:12:00 np0005466013 NetworkManager[51205]: <info>  [1759407120.1108] device (tapc6392e8e-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.112 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[137be643-b6eb-41f0-98f1-86649d6f7147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 nova_compute[192144]: 2025-10-02 12:12:00.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466013 systemd[1]: Started Virtual Machine qemu-23-instance-0000003a.
Oct  2 08:12:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:00Z|00162|binding|INFO|Setting lport c6392e8e-b868-4504-8dc9-392ef67e317b ovn-installed in OVS
Oct  2 08:12:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:00Z|00163|binding|INFO|Setting lport c6392e8e-b868-4504-8dc9-392ef67e317b up in Southbound
Oct  2 08:12:00 np0005466013 nova_compute[192144]: 2025-10-02 12:12:00.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.141 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[42529d9e-0198-48ac-9a1f-95e5b70a4cd9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.170 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[da5143da-981f-40be-9951-61bd931eb50b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.175 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc138fdf-295e-4bad-9e56-3b0506ae2563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 systemd-udevd[227626]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:00 np0005466013 NetworkManager[51205]: <info>  [1759407120.1780] manager: (tap875b01f6-40): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.207 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9f497787-9f38-4820-9607-d5305fe8c71a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.212 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[535c5669-09dc-47e6-8a5c-d21d314919b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 NetworkManager[51205]: <info>  [1759407120.2365] device (tap875b01f6-40): carrier: link connected
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.243 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[78e4113f-66a2-4377-a513-34468287af60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.262 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[38328c69-0cb1-4e99-8efe-8deb6b7ed4cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap875b01f6-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:77:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511185, 'reachable_time': 33146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227655, 'error': None, 'target': 'ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.283 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4adb8a98-8f6c-4019-b1ab-21fab18c5356]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe46:772a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511185, 'tstamp': 511185}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227656, 'error': None, 'target': 'ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.305 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d1fc6081-8267-4824-92d9-4d46db4a2294]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap875b01f6-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:77:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511185, 'reachable_time': 33146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227657, 'error': None, 'target': 'ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.342 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c6040749-aa5f-4daa-86c2-d0220d5e6d64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.407 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[404f3824-096c-4408-8ee1-869bc1a6cd4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.413 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap875b01f6-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.414 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.414 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap875b01f6-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:00 np0005466013 nova_compute[192144]: 2025-10-02 12:12:00.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466013 kernel: tap875b01f6-40: entered promiscuous mode
Oct  2 08:12:00 np0005466013 NetworkManager[51205]: <info>  [1759407120.4176] manager: (tap875b01f6-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.421 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap875b01f6-40, col_values=(('external_ids', {'iface-id': 'aa3e37aa-35c0-4db0-8881-f229002d6c1c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:00 np0005466013 nova_compute[192144]: 2025-10-02 12:12:00.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:00Z|00164|binding|INFO|Releasing lport aa3e37aa-35c0-4db0-8881-f229002d6c1c from this chassis (sb_readonly=0)
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.424 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/875b01f6-4815-49fe-b2f9-4ac15eeba242.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/875b01f6-4815-49fe-b2f9-4ac15eeba242.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.425 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[576fca31-6403-4e04-b30a-fa23e53e38ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.425 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-875b01f6-4815-49fe-b2f9-4ac15eeba242
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/875b01f6-4815-49fe-b2f9-4ac15eeba242.pid.haproxy
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 875b01f6-4815-49fe-b2f9-4ac15eeba242
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:12:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:00.426 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'env', 'PROCESS_TAG=haproxy-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/875b01f6-4815-49fe-b2f9-4ac15eeba242.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:12:00 np0005466013 nova_compute[192144]: 2025-10-02 12:12:00.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466013 podman[227696]: 2025-10-02 12:12:00.79541754 +0000 UTC m=+0.043116794 container create 788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:12:00 np0005466013 systemd[1]: Started libpod-conmon-788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09.scope.
Oct  2 08:12:00 np0005466013 podman[227696]: 2025-10-02 12:12:00.772007007 +0000 UTC m=+0.019706271 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:12:00 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:12:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0512d83c1ac855dc7a3c7b9dfc74f1752f0884dd3720753cc0378af83b2b619a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:12:00 np0005466013 podman[227696]: 2025-10-02 12:12:00.902969747 +0000 UTC m=+0.150669011 container init 788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:12:00 np0005466013 podman[227696]: 2025-10-02 12:12:00.908584571 +0000 UTC m=+0.156283815 container start 788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:12:00 np0005466013 neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242[227711]: [NOTICE]   (227715) : New worker (227717) forked
Oct  2 08:12:00 np0005466013 neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242[227711]: [NOTICE]   (227715) : Loading success.
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.129 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407121.1294985, acad95ab-c692-4eb2-b6e0-517da29f69ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.130 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] VM Started (Lifecycle Event)#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.161 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.163 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407121.1303933, acad95ab-c692-4eb2-b6e0-517da29f69ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.164 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.186 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.189 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.214 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.684 2 DEBUG nova.network.neutron [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Updated VIF entry in instance network info cache for port c6392e8e-b868-4504-8dc9-392ef67e317b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.685 2 DEBUG nova.network.neutron [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Updating instance_info_cache with network_info: [{"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:01 np0005466013 nova_compute[192144]: 2025-10-02 12:12:01.699 2 DEBUG oslo_concurrency.lockutils [req-ff4a8e04-8ab3-42e3-88f3-5c9c82400730 req-4430d6cc-236d-4578-9853-34f6ca6547fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.041 2 DEBUG nova.compute.manager [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.041 2 DEBUG oslo_concurrency.lockutils [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.042 2 DEBUG oslo_concurrency.lockutils [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.042 2 DEBUG oslo_concurrency.lockutils [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.042 2 DEBUG nova.compute.manager [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Processing event network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.042 2 DEBUG nova.compute.manager [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.043 2 DEBUG oslo_concurrency.lockutils [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.043 2 DEBUG oslo_concurrency.lockutils [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.043 2 DEBUG oslo_concurrency.lockutils [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.043 2 DEBUG nova.compute.manager [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] No waiting events found dispatching network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.043 2 WARNING nova.compute.manager [req-487809c3-4464-4122-b347-c43dc208994b req-e396fdc5-8d4b-4fb3-ac1d-f55c2c8689cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received unexpected event network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.044 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.047 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407122.0474908, acad95ab-c692-4eb2-b6e0-517da29f69ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.048 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.049 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.052 2 INFO nova.virt.libvirt.driver [-] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Instance spawned successfully.#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.052 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.066 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.072 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.076 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.076 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.076 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.077 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.077 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.077 2 DEBUG nova.virt.libvirt.driver [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.103 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:02.293 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:02.294 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:02.294 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.314 2 INFO nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Took 9.40 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.315 2 DEBUG nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.618 2 INFO nova.compute.manager [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Took 10.17 seconds to build instance.#033[00m
Oct  2 08:12:02 np0005466013 nova_compute[192144]: 2025-10-02 12:12:02.691 2 DEBUG oslo_concurrency.lockutils [None req-8f668534-4bbf-4ef4-bedb-e3ef3d1fa34f 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:03 np0005466013 nova_compute[192144]: 2025-10-02 12:12:03.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:04 np0005466013 NetworkManager[51205]: <info>  [1759407124.4574] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct  2 08:12:04 np0005466013 NetworkManager[51205]: <info>  [1759407124.4583] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:04 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:04Z|00165|binding|INFO|Releasing lport aa3e37aa-35c0-4db0-8881-f229002d6c1c from this chassis (sb_readonly=0)
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:04 np0005466013 podman[227727]: 2025-10-02 12:12:04.689646074 +0000 UTC m=+0.060895015 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:12:04 np0005466013 podman[227728]: 2025-10-02 12:12:04.717195336 +0000 UTC m=+0.088514529 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:12:04 np0005466013 podman[227729]: 2025-10-02 12:12:04.751034552 +0000 UTC m=+0.117765443 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.981 2 DEBUG nova.compute.manager [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-changed-c6392e8e-b868-4504-8dc9-392ef67e317b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.982 2 DEBUG nova.compute.manager [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Refreshing instance network info cache due to event network-changed-c6392e8e-b868-4504-8dc9-392ef67e317b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.982 2 DEBUG oslo_concurrency.lockutils [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.982 2 DEBUG oslo_concurrency.lockutils [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:04 np0005466013 nova_compute[192144]: 2025-10-02 12:12:04.982 2 DEBUG nova.network.neutron [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Refreshing network info cache for port c6392e8e-b868-4504-8dc9-392ef67e317b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:07 np0005466013 nova_compute[192144]: 2025-10-02 12:12:07.931 2 DEBUG nova.network.neutron [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Updated VIF entry in instance network info cache for port c6392e8e-b868-4504-8dc9-392ef67e317b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:12:07 np0005466013 nova_compute[192144]: 2025-10-02 12:12:07.931 2 DEBUG nova.network.neutron [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Updating instance_info_cache with network_info: [{"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:07 np0005466013 nova_compute[192144]: 2025-10-02 12:12:07.957 2 DEBUG oslo_concurrency.lockutils [req-ece6ed4f-6274-4423-946e-f62550ee6ac4 req-13d7c696-fdeb-4b12-8530-ffa88a0db500 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-acad95ab-c692-4eb2-b6e0-517da29f69ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:08 np0005466013 nova_compute[192144]: 2025-10-02 12:12:08.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:08 np0005466013 nova_compute[192144]: 2025-10-02 12:12:08.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:09 np0005466013 nova_compute[192144]: 2025-10-02 12:12:09.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:13 np0005466013 nova_compute[192144]: 2025-10-02 12:12:13.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:13 np0005466013 nova_compute[192144]: 2025-10-02 12:12:13.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:14 np0005466013 nova_compute[192144]: 2025-10-02 12:12:14.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:15 np0005466013 podman[227810]: 2025-10-02 12:12:15.680874605 +0000 UTC m=+0.060298665 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:12:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:15Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dc:48:b3 10.100.0.7
Oct  2 08:12:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:15Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dc:48:b3 10.100.0.7
Oct  2 08:12:17 np0005466013 nova_compute[192144]: 2025-10-02 12:12:17.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:18 np0005466013 nova_compute[192144]: 2025-10-02 12:12:18.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:18 np0005466013 podman[227827]: 2025-10-02 12:12:18.696829384 +0000 UTC m=+0.061454022 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:12:18 np0005466013 podman[227828]: 2025-10-02 12:12:18.696829594 +0000 UTC m=+0.060383529 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350)
Oct  2 08:12:18 np0005466013 nova_compute[192144]: 2025-10-02 12:12:18.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.023 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.113 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.177 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.178 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.247 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.411 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.413 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5555MB free_disk=73.3675308227539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.413 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.414 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.508 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance acad95ab-c692-4eb2-b6e0-517da29f69ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.509 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.509 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.554 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.579 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.616 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:12:19 np0005466013 nova_compute[192144]: 2025-10-02 12:12:19.617 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:20.390 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:20.392 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:20Z|00166|binding|INFO|Releasing lport aa3e37aa-35c0-4db0-8881-f229002d6c1c from this chassis (sb_readonly=0)
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.617 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.618 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.647 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.648 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.648 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.648 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:20 np0005466013 nova_compute[192144]: 2025-10-02 12:12:20.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:21.394 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:21 np0005466013 nova_compute[192144]: 2025-10-02 12:12:21.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:21 np0005466013 nova_compute[192144]: 2025-10-02 12:12:21.999 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:21.999 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.501 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.502 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.502 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.502 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.502 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.519 2 INFO nova.compute.manager [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Terminating instance#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.533 2 DEBUG nova.compute.manager [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:12:22 np0005466013 kernel: tapc6392e8e-b8 (unregistering): left promiscuous mode
Oct  2 08:12:22 np0005466013 NetworkManager[51205]: <info>  [1759407142.5587] device (tapc6392e8e-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:22Z|00167|binding|INFO|Releasing lport c6392e8e-b868-4504-8dc9-392ef67e317b from this chassis (sb_readonly=0)
Oct  2 08:12:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:22Z|00168|binding|INFO|Setting lport c6392e8e-b868-4504-8dc9-392ef67e317b down in Southbound
Oct  2 08:12:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:22Z|00169|binding|INFO|Removing iface tapc6392e8e-b8 ovn-installed in OVS
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.579 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:48:b3 10.100.0.7'], port_security=['fa:16:3e:dc:48:b3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'acad95ab-c692-4eb2-b6e0-517da29f69ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '098cc383ace84803b8f15713d2c201a1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be57de36-f32e-444f-ba2c-aeaf28293636', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dab12d7-53a9-429d-b445-cfccadfc19be, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c6392e8e-b868-4504-8dc9-392ef67e317b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.580 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c6392e8e-b868-4504-8dc9-392ef67e317b in datapath 875b01f6-4815-49fe-b2f9-4ac15eeba242 unbound from our chassis#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.582 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 875b01f6-4815-49fe-b2f9-4ac15eeba242, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.584 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2e1c5b-4704-41fa-a048-31006839b006]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.584 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242 namespace which is not needed anymore#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Oct  2 08:12:22 np0005466013 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000003a.scope: Consumed 13.867s CPU time.
Oct  2 08:12:22 np0005466013 systemd-machined[152202]: Machine qemu-23-instance-0000003a terminated.
Oct  2 08:12:22 np0005466013 podman[227875]: 2025-10-02 12:12:22.644952673 +0000 UTC m=+0.061122520 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:12:22 np0005466013 podman[227878]: 2025-10-02 12:12:22.65938529 +0000 UTC m=+0.075960760 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:12:22 np0005466013 neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242[227711]: [NOTICE]   (227715) : haproxy version is 2.8.14-c23fe91
Oct  2 08:12:22 np0005466013 neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242[227711]: [NOTICE]   (227715) : path to executable is /usr/sbin/haproxy
Oct  2 08:12:22 np0005466013 neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242[227711]: [WARNING]  (227715) : Exiting Master process...
Oct  2 08:12:22 np0005466013 neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242[227711]: [ALERT]    (227715) : Current worker (227717) exited with code 143 (Terminated)
Oct  2 08:12:22 np0005466013 neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242[227711]: [WARNING]  (227715) : All workers exited. Exiting... (0)
Oct  2 08:12:22 np0005466013 systemd[1]: libpod-788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09.scope: Deactivated successfully.
Oct  2 08:12:22 np0005466013 podman[227939]: 2025-10-02 12:12:22.733781411 +0000 UTC m=+0.048458929 container died 788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:12:22 np0005466013 kernel: tapc6392e8e-b8: entered promiscuous mode
Oct  2 08:12:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:22Z|00170|binding|INFO|Claiming lport c6392e8e-b868-4504-8dc9-392ef67e317b for this chassis.
Oct  2 08:12:22 np0005466013 systemd-udevd[227895]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:22Z|00171|binding|INFO|c6392e8e-b868-4504-8dc9-392ef67e317b: Claiming fa:16:3e:dc:48:b3 10.100.0.7
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09-userdata-shm.mount: Deactivated successfully.
Oct  2 08:12:22 np0005466013 NetworkManager[51205]: <info>  [1759407142.7682] manager: (tapc6392e8e-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Oct  2 08:12:22 np0005466013 systemd[1]: var-lib-containers-storage-overlay-0512d83c1ac855dc7a3c7b9dfc74f1752f0884dd3720753cc0378af83b2b619a-merged.mount: Deactivated successfully.
Oct  2 08:12:22 np0005466013 kernel: tapc6392e8e-b8 (unregistering): left promiscuous mode
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.773 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:48:b3 10.100.0.7'], port_security=['fa:16:3e:dc:48:b3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'acad95ab-c692-4eb2-b6e0-517da29f69ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '098cc383ace84803b8f15713d2c201a1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be57de36-f32e-444f-ba2c-aeaf28293636', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dab12d7-53a9-429d-b445-cfccadfc19be, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c6392e8e-b868-4504-8dc9-392ef67e317b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:22 np0005466013 podman[227939]: 2025-10-02 12:12:22.787006077 +0000 UTC m=+0.101683595 container cleanup 788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:22Z|00172|binding|INFO|Releasing lport c6392e8e-b868-4504-8dc9-392ef67e317b from this chassis (sb_readonly=0)
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.804 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:48:b3 10.100.0.7'], port_security=['fa:16:3e:dc:48:b3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'acad95ab-c692-4eb2-b6e0-517da29f69ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '098cc383ace84803b8f15713d2c201a1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be57de36-f32e-444f-ba2c-aeaf28293636', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dab12d7-53a9-429d-b445-cfccadfc19be, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c6392e8e-b868-4504-8dc9-392ef67e317b) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:22 np0005466013 systemd[1]: libpod-conmon-788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09.scope: Deactivated successfully.
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.837 2 INFO nova.virt.libvirt.driver [-] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Instance destroyed successfully.#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.838 2 DEBUG nova.objects.instance [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lazy-loading 'resources' on Instance uuid acad95ab-c692-4eb2-b6e0-517da29f69ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.865 2 DEBUG nova.virt.libvirt.vif [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-581821638',display_name='tempest-ServersTestManualDisk-server-581821638',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-581821638',id=58,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGso+miI8qCYT4p0eCAv5njDOKrweNd8akhDENvee/f5qXp2M6cmnrI7KRJufnciA1FtuchHAFBjX1yiV0u+1PF0AkEv7TVePk+hbK9x3irpqtSU57UjRRd76dMT+VcS4g==',key_name='tempest-keypair-12061868',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:12:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='098cc383ace84803b8f15713d2c201a1',ramdisk_id='',reservation_id='r-amb05o0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1767575425',owner_user_name='tempest-ServersTestManualDisk-1767575425-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:12:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='576e822ba888408e92dc7462577fbdb9',uuid=acad95ab-c692-4eb2-b6e0-517da29f69ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.865 2 DEBUG nova.network.os_vif_util [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Converting VIF {"id": "c6392e8e-b868-4504-8dc9-392ef67e317b", "address": "fa:16:3e:dc:48:b3", "network": {"id": "875b01f6-4815-49fe-b2f9-4ac15eeba242", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1496494398-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "098cc383ace84803b8f15713d2c201a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6392e8e-b8", "ovs_interfaceid": "c6392e8e-b868-4504-8dc9-392ef67e317b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.867 2 DEBUG nova.network.os_vif_util [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:48:b3,bridge_name='br-int',has_traffic_filtering=True,id=c6392e8e-b868-4504-8dc9-392ef67e317b,network=Network(875b01f6-4815-49fe-b2f9-4ac15eeba242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6392e8e-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.867 2 DEBUG os_vif [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:48:b3,bridge_name='br-int',has_traffic_filtering=True,id=c6392e8e-b868-4504-8dc9-392ef67e317b,network=Network(875b01f6-4815-49fe-b2f9-4ac15eeba242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6392e8e-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.870 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6392e8e-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 podman[227971]: 2025-10-02 12:12:22.875307538 +0000 UTC m=+0.059269174 container remove 788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.879 2 INFO os_vif [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:48:b3,bridge_name='br-int',has_traffic_filtering=True,id=c6392e8e-b868-4504-8dc9-392ef67e317b,network=Network(875b01f6-4815-49fe-b2f9-4ac15eeba242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6392e8e-b8')#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.879 2 INFO nova.virt.libvirt.driver [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Deleting instance files /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef_del#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.880 2 INFO nova.virt.libvirt.driver [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Deletion of /var/lib/nova/instances/acad95ab-c692-4eb2-b6e0-517da29f69ef_del complete#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.882 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[25e3e2be-6128-4ef7-9475-a7f1225b056d]: (4, ('Thu Oct  2 12:12:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242 (788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09)\n788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09\nThu Oct  2 12:12:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242 (788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09)\n788b76f2f6224582228c8bca713f57f41be488242f5aa711c7a628bcc5015e09\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.884 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e6de306e-7833-4fb3-a8a4-b18a175a7585]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.885 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap875b01f6-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 kernel: tap875b01f6-40: left promiscuous mode
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.903 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[21486964-1a45-48ce-b777-107376816120]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.916 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.917 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.931 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4eee6a7e-4428-49e4-ad82-02ae97aa4a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.932 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d83a476f-a0f9-43ef-90af-941925683fa1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.942 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.948 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae23b6e-d667-4bc0-ada3-2c83ad20d7f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511178, 'reachable_time': 38519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227993, 'error': None, 'target': 'ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 systemd[1]: run-netns-ovnmeta\x2d875b01f6\x2d4815\x2d49fe\x2db2f9\x2d4ac15eeba242.mount: Deactivated successfully.
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.952 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-875b01f6-4815-49fe-b2f9-4ac15eeba242 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.953 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[9d69f551-1827-4001-9203-db8fd6bc08fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.954 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c6392e8e-b868-4504-8dc9-392ef67e317b in datapath 875b01f6-4815-49fe-b2f9-4ac15eeba242 unbound from our chassis#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.956 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 875b01f6-4815-49fe-b2f9-4ac15eeba242, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.957 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[647cb4cb-4676-4fff-8cb1-af4ccbcef189]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.958 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c6392e8e-b868-4504-8dc9-392ef67e317b in datapath 875b01f6-4815-49fe-b2f9-4ac15eeba242 unbound from our chassis#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.959 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 875b01f6-4815-49fe-b2f9-4ac15eeba242, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:12:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:22.959 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a39144ca-6fec-4683-bb72-377a3e2345cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.973 2 DEBUG nova.compute.manager [req-5771922c-f85a-41b8-85a9-e43b0292551d req-b8a980ca-eee4-43f8-92a0-a9a84ea253de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-vif-unplugged-c6392e8e-b868-4504-8dc9-392ef67e317b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.974 2 DEBUG oslo_concurrency.lockutils [req-5771922c-f85a-41b8-85a9-e43b0292551d req-b8a980ca-eee4-43f8-92a0-a9a84ea253de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.974 2 DEBUG oslo_concurrency.lockutils [req-5771922c-f85a-41b8-85a9-e43b0292551d req-b8a980ca-eee4-43f8-92a0-a9a84ea253de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.974 2 DEBUG oslo_concurrency.lockutils [req-5771922c-f85a-41b8-85a9-e43b0292551d req-b8a980ca-eee4-43f8-92a0-a9a84ea253de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.974 2 DEBUG nova.compute.manager [req-5771922c-f85a-41b8-85a9-e43b0292551d req-b8a980ca-eee4-43f8-92a0-a9a84ea253de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] No waiting events found dispatching network-vif-unplugged-c6392e8e-b868-4504-8dc9-392ef67e317b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.974 2 DEBUG nova.compute.manager [req-5771922c-f85a-41b8-85a9-e43b0292551d req-b8a980ca-eee4-43f8-92a0-a9a84ea253de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-vif-unplugged-c6392e8e-b868-4504-8dc9-392ef67e317b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.980 2 INFO nova.compute.manager [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.980 2 DEBUG oslo.service.loopingcall [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.981 2 DEBUG nova.compute.manager [-] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:12:22 np0005466013 nova_compute[192144]: 2025-10-02 12:12:22.981 2 DEBUG nova.network.neutron [-] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.036 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.037 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.045 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.046 2 INFO nova.compute.claims [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.174 2 DEBUG nova.compute.provider_tree [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.190 2 DEBUG nova.scheduler.client.report [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.211 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.212 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.262 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.263 2 DEBUG nova.network.neutron [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.305 2 INFO nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.330 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.479 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.481 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.481 2 INFO nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Creating image(s)#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.482 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "/var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.482 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "/var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.483 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "/var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.500 2 DEBUG nova.policy [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.504 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.584 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.585 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.586 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.596 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.657 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.659 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.696 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.697 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.698 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.760 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.761 2 DEBUG nova.virt.disk.api [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Checking if we can resize image /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.762 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.826 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.827 2 DEBUG nova.virt.disk.api [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Cannot resize image /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.827 2 DEBUG nova.objects.instance [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lazy-loading 'migration_context' on Instance uuid c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.841 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.841 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Ensure instance console log exists: /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.842 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.842 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.842 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:23 np0005466013 nova_compute[192144]: 2025-10-02 12:12:23.999 2 DEBUG nova.network.neutron [-] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.018 2 INFO nova.compute.manager [-] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Took 1.04 seconds to deallocate network for instance.#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.068 2 DEBUG nova.compute.manager [req-34355132-0fc0-4280-9724-c8dc2766f706 req-973fc46e-2c3f-4d9d-8fc3-9441d771c89d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-vif-deleted-c6392e8e-b868-4504-8dc9-392ef67e317b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.109 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.110 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.193 2 DEBUG nova.compute.provider_tree [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.211 2 DEBUG nova.scheduler.client.report [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.232 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.255 2 INFO nova.scheduler.client.report [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Deleted allocations for instance acad95ab-c692-4eb2-b6e0-517da29f69ef#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.358 2 DEBUG oslo_concurrency.lockutils [None req-cba5d885-cc2f-4e49-ad98-3df43655d07a 576e822ba888408e92dc7462577fbdb9 098cc383ace84803b8f15713d2c201a1 - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.390 2 DEBUG nova.network.neutron [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Successfully created port: 48594db4-66a1-4d35-90bd-fa3b1e19b08a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:24 np0005466013 nova_compute[192144]: 2025-10-02 12:12:24.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.050 2 DEBUG nova.compute.manager [req-08a084ce-85ee-4062-9512-20493e4bdea1 req-3995c609-3030-4e9c-987b-ef94f50ccd85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received event network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.051 2 DEBUG oslo_concurrency.lockutils [req-08a084ce-85ee-4062-9512-20493e4bdea1 req-3995c609-3030-4e9c-987b-ef94f50ccd85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.052 2 DEBUG oslo_concurrency.lockutils [req-08a084ce-85ee-4062-9512-20493e4bdea1 req-3995c609-3030-4e9c-987b-ef94f50ccd85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.052 2 DEBUG oslo_concurrency.lockutils [req-08a084ce-85ee-4062-9512-20493e4bdea1 req-3995c609-3030-4e9c-987b-ef94f50ccd85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "acad95ab-c692-4eb2-b6e0-517da29f69ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.052 2 DEBUG nova.compute.manager [req-08a084ce-85ee-4062-9512-20493e4bdea1 req-3995c609-3030-4e9c-987b-ef94f50ccd85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] No waiting events found dispatching network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.052 2 WARNING nova.compute.manager [req-08a084ce-85ee-4062-9512-20493e4bdea1 req-3995c609-3030-4e9c-987b-ef94f50ccd85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Received unexpected event network-vif-plugged-c6392e8e-b868-4504-8dc9-392ef67e317b for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.833 2 DEBUG nova.network.neutron [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Successfully updated port: 48594db4-66a1-4d35-90bd-fa3b1e19b08a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.866 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.866 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquired lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.867 2 DEBUG nova.network.neutron [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:25 np0005466013 nova_compute[192144]: 2025-10-02 12:12:25.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:26 np0005466013 nova_compute[192144]: 2025-10-02 12:12:26.160 2 DEBUG nova.network.neutron [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:12:26 np0005466013 nova_compute[192144]: 2025-10-02 12:12:26.495 2 DEBUG nova.compute.manager [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-changed-48594db4-66a1-4d35-90bd-fa3b1e19b08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:26 np0005466013 nova_compute[192144]: 2025-10-02 12:12:26.495 2 DEBUG nova.compute.manager [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Refreshing instance network info cache due to event network-changed-48594db4-66a1-4d35-90bd-fa3b1e19b08a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:12:26 np0005466013 nova_compute[192144]: 2025-10-02 12:12:26.496 2 DEBUG oslo_concurrency.lockutils [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:27 np0005466013 nova_compute[192144]: 2025-10-02 12:12:27.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.291 2 DEBUG nova.network.neutron [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Updating instance_info_cache with network_info: [{"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.347 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Releasing lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.348 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Instance network_info: |[{"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.349 2 DEBUG oslo_concurrency.lockutils [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.350 2 DEBUG nova.network.neutron [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Refreshing network info cache for port 48594db4-66a1-4d35-90bd-fa3b1e19b08a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.355 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Start _get_guest_xml network_info=[{"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.364 2 WARNING nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.368 2 DEBUG nova.virt.libvirt.host [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.369 2 DEBUG nova.virt.libvirt.host [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.373 2 DEBUG nova.virt.libvirt.host [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.373 2 DEBUG nova.virt.libvirt.host [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.375 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.375 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.376 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.376 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.376 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.376 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.377 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.377 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.377 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.378 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.378 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.379 2 DEBUG nova.virt.hardware [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.383 2 DEBUG nova.virt.libvirt.vif [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1083280113',display_name='tempest-ServerActionsTestOtherA-server-1083280113',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1083280113',id=61,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20417475a6a149d5bc47976f4da9a4ae',ramdisk_id='',reservation_id='r-0r0e0uqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-352727288',owner_user_name='tempest-ServerActionsTest
OtherA-352727288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:23Z,user_data=None,user_id='c2b9eab3da414692b3942505e3441920',uuid=c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.384 2 DEBUG nova.network.os_vif_util [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converting VIF {"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.384 2 DEBUG nova.network.os_vif_util [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:c7:0f,bridge_name='br-int',has_traffic_filtering=True,id=48594db4-66a1-4d35-90bd-fa3b1e19b08a,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48594db4-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.385 2 DEBUG nova.objects.instance [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lazy-loading 'pci_devices' on Instance uuid c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.408 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <uuid>c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0</uuid>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <name>instance-0000003d</name>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerActionsTestOtherA-server-1083280113</nova:name>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:12:28</nova:creationTime>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:user uuid="c2b9eab3da414692b3942505e3441920">tempest-ServerActionsTestOtherA-352727288-project-member</nova:user>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:project uuid="20417475a6a149d5bc47976f4da9a4ae">tempest-ServerActionsTestOtherA-352727288</nova:project>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        <nova:port uuid="48594db4-66a1-4d35-90bd-fa3b1e19b08a">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <entry name="serial">c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0</entry>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <entry name="uuid">c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0</entry>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk.config"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:48:c7:0f"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <target dev="tap48594db4-66"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/console.log" append="off"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:12:28 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:12:28 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:12:28 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:12:28 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.411 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Preparing to wait for external event network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.411 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.412 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.412 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.413 2 DEBUG nova.virt.libvirt.vif [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1083280113',display_name='tempest-ServerActionsTestOtherA-server-1083280113',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1083280113',id=61,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20417475a6a149d5bc47976f4da9a4ae',ramdisk_id='',reservation_id='r-0r0e0uqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-352727288',owner_user_name='tempest-ServerActionsTestOtherA-352727288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:23Z,user_data=None,user_id='c2b9eab3da414692b3942505e3441920',uuid=c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.413 2 DEBUG nova.network.os_vif_util [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converting VIF {"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.414 2 DEBUG nova.network.os_vif_util [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:c7:0f,bridge_name='br-int',has_traffic_filtering=True,id=48594db4-66a1-4d35-90bd-fa3b1e19b08a,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48594db4-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.414 2 DEBUG os_vif [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:c7:0f,bridge_name='br-int',has_traffic_filtering=True,id=48594db4-66a1-4d35-90bd-fa3b1e19b08a,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48594db4-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.415 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.416 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.419 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48594db4-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.420 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap48594db4-66, col_values=(('external_ids', {'iface-id': '48594db4-66a1-4d35-90bd-fa3b1e19b08a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:c7:0f', 'vm-uuid': 'c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:28 np0005466013 NetworkManager[51205]: <info>  [1759407148.4230] manager: (tap48594db4-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.427 2 INFO os_vif [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:c7:0f,bridge_name='br-int',has_traffic_filtering=True,id=48594db4-66a1-4d35-90bd-fa3b1e19b08a,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48594db4-66')#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.496 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.497 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.497 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] No VIF found with MAC fa:16:3e:48:c7:0f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:12:28 np0005466013 nova_compute[192144]: 2025-10-02 12:12:28.497 2 INFO nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Using config drive#033[00m
Oct  2 08:12:29 np0005466013 nova_compute[192144]: 2025-10-02 12:12:29.814 2 INFO nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Creating config drive at /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk.config#033[00m
Oct  2 08:12:29 np0005466013 nova_compute[192144]: 2025-10-02 12:12:29.819 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeeevi4zb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:29 np0005466013 nova_compute[192144]: 2025-10-02 12:12:29.951 2 DEBUG oslo_concurrency.processutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeeevi4zb" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:30 np0005466013 kernel: tap48594db4-66: entered promiscuous mode
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.0229] manager: (tap48594db4-66): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:30Z|00173|binding|INFO|Claiming lport 48594db4-66a1-4d35-90bd-fa3b1e19b08a for this chassis.
Oct  2 08:12:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:30Z|00174|binding|INFO|48594db4-66a1-4d35-90bd-fa3b1e19b08a: Claiming fa:16:3e:48:c7:0f 10.100.0.11
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.0389] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.0397] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.048 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:c7:0f 10.100.0.11'], port_security=['fa:16:3e:48:c7:0f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20417475a6a149d5bc47976f4da9a4ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c517fcc5-4e7c-4008-ac85-cb7cba93cd1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8a937e8-285b-47d1-b87a-47c75465be5a, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=48594db4-66a1-4d35-90bd-fa3b1e19b08a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.050 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 48594db4-66a1-4d35-90bd-fa3b1e19b08a in datapath 2bdfd186-139e-456a-92e9-4dc9c37a846a bound to our chassis#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.051 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2bdfd186-139e-456a-92e9-4dc9c37a846a#033[00m
Oct  2 08:12:30 np0005466013 systemd-udevd[228030]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:30 np0005466013 systemd-machined[152202]: New machine qemu-24-instance-0000003d.
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.066 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5155f7de-b72d-4d6c-bf3d-7617bbeaa665]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.067 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2bdfd186-11 in ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.070 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2bdfd186-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.071 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3501f583-6067-4cf5-a475-ad8411b7f3bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.0724] device (tap48594db4-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.071 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[42e30f96-9f81-425c-9d8a-0ccc2d3cd9d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.0733] device (tap48594db4-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.083 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9c7e77-fb3b-41ba-a0a9-7e00f452a028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 systemd[1]: Started Virtual Machine qemu-24-instance-0000003d.
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.120 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc3bd9d-9084-4470-a05e-3cd637aeb514]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.161 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd653c0-e4a7-4b0d-b88a-f27b5329a93f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.167 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[10d1a937-a5b0-4132-bd27-c3a3bd27f1b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.1723] manager: (tap2bdfd186-10): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:30Z|00175|binding|INFO|Setting lport 48594db4-66a1-4d35-90bd-fa3b1e19b08a ovn-installed in OVS
Oct  2 08:12:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:30Z|00176|binding|INFO|Setting lport 48594db4-66a1-4d35-90bd-fa3b1e19b08a up in Southbound
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.206 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a9dc5d-473f-4e4a-8eaa-ad380c3f379c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.210 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[5f77fec7-22ab-45f9-8617-659674b7fbbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.2341] device (tap2bdfd186-10): carrier: link connected
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.241 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f80542-cba7-4887-aa05-1989d045c95c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.260 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[38f2451a-bf0f-4dd0-b0be-3d4544f684ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bdfd186-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:b7:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514185, 'reachable_time': 21958, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228064, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.278 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[16a972b7-6153-48fe-9245-5a4c12be7b6b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:b789'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514185, 'tstamp': 514185}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228065, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.298 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4ac007-9caf-40a0-8e0e-12629c04bfb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bdfd186-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:b7:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514185, 'reachable_time': 21958, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228066, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.333 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a85becc1-48da-438a-bb49-7babf8838c35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.400 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef321fc-1383-4ae3-9d7f-0220b0c72193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.402 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bdfd186-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.403 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.403 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2bdfd186-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 kernel: tap2bdfd186-10: entered promiscuous mode
Oct  2 08:12:30 np0005466013 NetworkManager[51205]: <info>  [1759407150.4085] manager: (tap2bdfd186-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.409 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2bdfd186-10, col_values=(('external_ids', {'iface-id': '1e2d82b4-a363-4c19-94d1-e62c1ba8e34a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:30Z|00177|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.412 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2bdfd186-139e-456a-92e9-4dc9c37a846a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2bdfd186-139e-456a-92e9-4dc9c37a846a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.413 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7d5080-a346-4807-b4cd-1fd9ffc1b3da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.413 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-2bdfd186-139e-456a-92e9-4dc9c37a846a
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/2bdfd186-139e-456a-92e9-4dc9c37a846a.pid.haproxy
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 2bdfd186-139e-456a-92e9-4dc9c37a846a
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:12:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:30.414 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'env', 'PROCESS_TAG=haproxy-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2bdfd186-139e-456a-92e9-4dc9c37a846a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:12:30 np0005466013 nova_compute[192144]: 2025-10-02 12:12:30.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466013 podman[228098]: 2025-10-02 12:12:30.786726526 +0000 UTC m=+0.060663887 container create f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:12:30 np0005466013 podman[228098]: 2025-10-02 12:12:30.753321483 +0000 UTC m=+0.027258864 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:12:30 np0005466013 systemd[1]: Started libpod-conmon-f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a.scope.
Oct  2 08:12:30 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:12:30 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5909bae5241d9552aed57e64b9ef118f3cfd806641ad1d4502797fc6a9161b36/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:12:30 np0005466013 podman[228098]: 2025-10-02 12:12:30.894371435 +0000 UTC m=+0.168308816 container init f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:12:30 np0005466013 podman[228098]: 2025-10-02 12:12:30.899917797 +0000 UTC m=+0.173855158 container start f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:12:30 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [NOTICE]   (228124) : New worker (228126) forked
Oct  2 08:12:30 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [NOTICE]   (228124) : Loading success.
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.218 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407151.2174897, c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.219 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] VM Started (Lifecycle Event)#033[00m
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.247 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.252 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407151.2180874, c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.252 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.271 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.275 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:31 np0005466013 nova_compute[192144]: 2025-10-02 12:12:31.300 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.881 2 DEBUG nova.compute.manager [req-fd183074-32f4-4b2e-a9bc-a1ba123ca7b8 req-2ba55ad9-399f-477c-81fc-7cf4d3868b73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.882 2 DEBUG oslo_concurrency.lockutils [req-fd183074-32f4-4b2e-a9bc-a1ba123ca7b8 req-2ba55ad9-399f-477c-81fc-7cf4d3868b73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.883 2 DEBUG oslo_concurrency.lockutils [req-fd183074-32f4-4b2e-a9bc-a1ba123ca7b8 req-2ba55ad9-399f-477c-81fc-7cf4d3868b73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.883 2 DEBUG oslo_concurrency.lockutils [req-fd183074-32f4-4b2e-a9bc-a1ba123ca7b8 req-2ba55ad9-399f-477c-81fc-7cf4d3868b73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.883 2 DEBUG nova.compute.manager [req-fd183074-32f4-4b2e-a9bc-a1ba123ca7b8 req-2ba55ad9-399f-477c-81fc-7cf4d3868b73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Processing event network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.884 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.888 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407152.8880978, c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.889 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.890 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.895 2 INFO nova.virt.libvirt.driver [-] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Instance spawned successfully.#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.896 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.913 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.917 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.927 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.928 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.929 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.929 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.930 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.930 2 DEBUG nova.virt.libvirt.driver [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.941 2 DEBUG nova.network.neutron [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Updated VIF entry in instance network info cache for port 48594db4-66a1-4d35-90bd-fa3b1e19b08a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.941 2 DEBUG nova.network.neutron [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Updating instance_info_cache with network_info: [{"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.947 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:32 np0005466013 nova_compute[192144]: 2025-10-02 12:12:32.971 2 DEBUG oslo_concurrency.lockutils [req-6d797adb-cc88-4e01-b8f3-68684746e6b6 req-71a796d6-5775-4563-a8e1-bbf374e82087 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:33 np0005466013 nova_compute[192144]: 2025-10-02 12:12:33.092 2 INFO nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Took 9.61 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:12:33 np0005466013 nova_compute[192144]: 2025-10-02 12:12:33.093 2 DEBUG nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:33 np0005466013 nova_compute[192144]: 2025-10-02 12:12:33.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:33 np0005466013 nova_compute[192144]: 2025-10-02 12:12:33.285 2 INFO nova.compute.manager [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Took 10.28 seconds to build instance.#033[00m
Oct  2 08:12:33 np0005466013 nova_compute[192144]: 2025-10-02 12:12:33.307 2 DEBUG oslo_concurrency.lockutils [None req-c25caaf6-405d-4f1b-9b5e-ce49ad4466a1 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:33Z|00178|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:12:33 np0005466013 nova_compute[192144]: 2025-10-02 12:12:33.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:33 np0005466013 nova_compute[192144]: 2025-10-02 12:12:33.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:35 np0005466013 nova_compute[192144]: 2025-10-02 12:12:35.006 2 DEBUG nova.compute.manager [req-ca9df224-d5b3-42f5-ad01-955e7e27497b req-c11d0efc-706a-45bf-8040-2f3b2885147d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:35 np0005466013 nova_compute[192144]: 2025-10-02 12:12:35.007 2 DEBUG oslo_concurrency.lockutils [req-ca9df224-d5b3-42f5-ad01-955e7e27497b req-c11d0efc-706a-45bf-8040-2f3b2885147d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:35 np0005466013 nova_compute[192144]: 2025-10-02 12:12:35.007 2 DEBUG oslo_concurrency.lockutils [req-ca9df224-d5b3-42f5-ad01-955e7e27497b req-c11d0efc-706a-45bf-8040-2f3b2885147d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:35 np0005466013 nova_compute[192144]: 2025-10-02 12:12:35.008 2 DEBUG oslo_concurrency.lockutils [req-ca9df224-d5b3-42f5-ad01-955e7e27497b req-c11d0efc-706a-45bf-8040-2f3b2885147d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:35 np0005466013 nova_compute[192144]: 2025-10-02 12:12:35.008 2 DEBUG nova.compute.manager [req-ca9df224-d5b3-42f5-ad01-955e7e27497b req-c11d0efc-706a-45bf-8040-2f3b2885147d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] No waiting events found dispatching network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:35 np0005466013 nova_compute[192144]: 2025-10-02 12:12:35.008 2 WARNING nova.compute.manager [req-ca9df224-d5b3-42f5-ad01-955e7e27497b req-c11d0efc-706a-45bf-8040-2f3b2885147d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received unexpected event network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:12:35 np0005466013 podman[228135]: 2025-10-02 12:12:35.679662767 +0000 UTC m=+0.053370761 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:12:35 np0005466013 podman[228136]: 2025-10-02 12:12:35.705349661 +0000 UTC m=+0.077236929 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 08:12:35 np0005466013 podman[228137]: 2025-10-02 12:12:35.710617675 +0000 UTC m=+0.079155820 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 08:12:36 np0005466013 nova_compute[192144]: 2025-10-02 12:12:36.906 2 DEBUG nova.compute.manager [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-changed-48594db4-66a1-4d35-90bd-fa3b1e19b08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:36 np0005466013 nova_compute[192144]: 2025-10-02 12:12:36.907 2 DEBUG nova.compute.manager [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Refreshing instance network info cache due to event network-changed-48594db4-66a1-4d35-90bd-fa3b1e19b08a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:12:36 np0005466013 nova_compute[192144]: 2025-10-02 12:12:36.907 2 DEBUG oslo_concurrency.lockutils [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:36 np0005466013 nova_compute[192144]: 2025-10-02 12:12:36.908 2 DEBUG oslo_concurrency.lockutils [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:36 np0005466013 nova_compute[192144]: 2025-10-02 12:12:36.908 2 DEBUG nova.network.neutron [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Refreshing network info cache for port 48594db4-66a1-4d35-90bd-fa3b1e19b08a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:37 np0005466013 nova_compute[192144]: 2025-10-02 12:12:37.836 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407142.8344412, acad95ab-c692-4eb2-b6e0-517da29f69ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:37 np0005466013 nova_compute[192144]: 2025-10-02 12:12:37.837 2 INFO nova.compute.manager [-] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:12:37 np0005466013 nova_compute[192144]: 2025-10-02 12:12:37.876 2 DEBUG nova.compute.manager [None req-edd84190-4195-4fa2-91f1-0e668e16ddac - - - - - -] [instance: acad95ab-c692-4eb2-b6e0-517da29f69ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:38 np0005466013 nova_compute[192144]: 2025-10-02 12:12:38.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:38 np0005466013 nova_compute[192144]: 2025-10-02 12:12:38.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:39 np0005466013 nova_compute[192144]: 2025-10-02 12:12:39.335 2 DEBUG nova.network.neutron [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Updated VIF entry in instance network info cache for port 48594db4-66a1-4d35-90bd-fa3b1e19b08a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:12:39 np0005466013 nova_compute[192144]: 2025-10-02 12:12:39.336 2 DEBUG nova.network.neutron [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Updating instance_info_cache with network_info: [{"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:39 np0005466013 nova_compute[192144]: 2025-10-02 12:12:39.360 2 DEBUG oslo_concurrency.lockutils [req-5a7a8609-d696-49df-8096-a3ec56bdace1 req-5d528616-f719-4dd2-b001-88571ddb0b48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.274 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.275 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.276 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.276 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.277 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.318 2 INFO nova.compute.manager [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Terminating instance#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.334 2 DEBUG nova.compute.manager [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:12:41 np0005466013 kernel: tap48594db4-66 (unregistering): left promiscuous mode
Oct  2 08:12:41 np0005466013 NetworkManager[51205]: <info>  [1759407161.3546] device (tap48594db4-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:12:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:41Z|00179|binding|INFO|Releasing lport 48594db4-66a1-4d35-90bd-fa3b1e19b08a from this chassis (sb_readonly=0)
Oct  2 08:12:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:41Z|00180|binding|INFO|Setting lport 48594db4-66a1-4d35-90bd-fa3b1e19b08a down in Southbound
Oct  2 08:12:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:41Z|00181|binding|INFO|Removing iface tap48594db4-66 ovn-installed in OVS
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.429 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:c7:0f 10.100.0.11'], port_security=['fa:16:3e:48:c7:0f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20417475a6a149d5bc47976f4da9a4ae', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8a937e8-285b-47d1-b87a-47c75465be5a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=48594db4-66a1-4d35-90bd-fa3b1e19b08a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.431 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 48594db4-66a1-4d35-90bd-fa3b1e19b08a in datapath 2bdfd186-139e-456a-92e9-4dc9c37a846a unbound from our chassis#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.433 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2bdfd186-139e-456a-92e9-4dc9c37a846a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.436 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f060eff4-21d9-411d-97f9-99ccb6b04b19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.438 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a namespace which is not needed anymore#033[00m
Oct  2 08:12:41 np0005466013 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Oct  2 08:12:41 np0005466013 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000003d.scope: Consumed 9.549s CPU time.
Oct  2 08:12:41 np0005466013 systemd-machined[152202]: Machine qemu-24-instance-0000003d terminated.
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.583 2 INFO nova.virt.libvirt.driver [-] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Instance destroyed successfully.#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.583 2 DEBUG nova.objects.instance [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lazy-loading 'resources' on Instance uuid c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.596 2 DEBUG nova.virt.libvirt.vif [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1083280113',display_name='tempest-ServerActionsTestOtherA-server-1083280113',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1083280113',id=61,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:12:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20417475a6a149d5bc47976f4da9a4ae',ramdisk_id='',reservation_id='r-0r0e0uqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-352727288',owner_user_name='tempest-ServerActionsTestOtherA-352727288-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:12:33Z,user_data=None,user_id='c2b9eab3da414692b3942505e3441920',uuid=c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.597 2 DEBUG nova.network.os_vif_util [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converting VIF {"id": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "address": "fa:16:3e:48:c7:0f", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48594db4-66", "ovs_interfaceid": "48594db4-66a1-4d35-90bd-fa3b1e19b08a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.597 2 DEBUG nova.network.os_vif_util [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:c7:0f,bridge_name='br-int',has_traffic_filtering=True,id=48594db4-66a1-4d35-90bd-fa3b1e19b08a,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48594db4-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.597 2 DEBUG os_vif [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:c7:0f,bridge_name='br-int',has_traffic_filtering=True,id=48594db4-66a1-4d35-90bd-fa3b1e19b08a,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48594db4-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48594db4-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.606 2 INFO os_vif [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:c7:0f,bridge_name='br-int',has_traffic_filtering=True,id=48594db4-66a1-4d35-90bd-fa3b1e19b08a,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48594db4-66')#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.606 2 INFO nova.virt.libvirt.driver [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Deleting instance files /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0_del#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.607 2 INFO nova.virt.libvirt.driver [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Deletion of /var/lib/nova/instances/c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0_del complete#033[00m
Oct  2 08:12:41 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [NOTICE]   (228124) : haproxy version is 2.8.14-c23fe91
Oct  2 08:12:41 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [NOTICE]   (228124) : path to executable is /usr/sbin/haproxy
Oct  2 08:12:41 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [WARNING]  (228124) : Exiting Master process...
Oct  2 08:12:41 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [WARNING]  (228124) : Exiting Master process...
Oct  2 08:12:41 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [ALERT]    (228124) : Current worker (228126) exited with code 143 (Terminated)
Oct  2 08:12:41 np0005466013 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[228118]: [WARNING]  (228124) : All workers exited. Exiting... (0)
Oct  2 08:12:41 np0005466013 systemd[1]: libpod-f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a.scope: Deactivated successfully.
Oct  2 08:12:41 np0005466013 podman[228223]: 2025-10-02 12:12:41.674219559 +0000 UTC m=+0.155848800 container died f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.673 2 INFO nova.compute.manager [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.674 2 DEBUG oslo.service.loopingcall [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.674 2 DEBUG nova.compute.manager [-] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.674 2 DEBUG nova.network.neutron [-] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:12:41 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a-userdata-shm.mount: Deactivated successfully.
Oct  2 08:12:41 np0005466013 systemd[1]: var-lib-containers-storage-overlay-5909bae5241d9552aed57e64b9ef118f3cfd806641ad1d4502797fc6a9161b36-merged.mount: Deactivated successfully.
Oct  2 08:12:41 np0005466013 podman[228223]: 2025-10-02 12:12:41.730672965 +0000 UTC m=+0.212302196 container cleanup f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:12:41 np0005466013 systemd[1]: libpod-conmon-f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a.scope: Deactivated successfully.
Oct  2 08:12:41 np0005466013 podman[228268]: 2025-10-02 12:12:41.803090335 +0000 UTC m=+0.047534611 container remove f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.809 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[96cb1750-3c21-42d6-8e03-665ff067e13a]: (4, ('Thu Oct  2 12:12:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a (f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a)\nf88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a\nThu Oct  2 12:12:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a (f88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a)\nf88dde730675ed7397b672276191f377aa4c0d1d0e2941be1a9930a860b0b68a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.811 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cf327785-e37d-4cef-ab21-731786ed7517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.813 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bdfd186-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:41 np0005466013 kernel: tap2bdfd186-10: left promiscuous mode
Oct  2 08:12:41 np0005466013 nova_compute[192144]: 2025-10-02 12:12:41.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.831 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e962f7e3-db53-409e-aea4-e0ce54cff5fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.864 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[98bc05f8-99f7-47c2-8fdd-56d788abb3bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.866 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[46c56152-02e3-47a9-8f9b-c630e13314b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.881 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e24aca-ae75-4586-938e-d5e5c0b4bab4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514177, 'reachable_time': 39679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228283, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.884 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:12:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:41.884 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[70b97a11-39b7-4d90-9239-b2d6c040293b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:41 np0005466013 systemd[1]: run-netns-ovnmeta\x2d2bdfd186\x2d139e\x2d456a\x2d92e9\x2d4dc9c37a846a.mount: Deactivated successfully.
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.302 2 DEBUG nova.compute.manager [req-867e7d7e-d33f-4fcb-8792-7c71e058e421 req-e35265c9-639d-40d7-ad01-aa4d9960c744 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-vif-unplugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.303 2 DEBUG oslo_concurrency.lockutils [req-867e7d7e-d33f-4fcb-8792-7c71e058e421 req-e35265c9-639d-40d7-ad01-aa4d9960c744 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.303 2 DEBUG oslo_concurrency.lockutils [req-867e7d7e-d33f-4fcb-8792-7c71e058e421 req-e35265c9-639d-40d7-ad01-aa4d9960c744 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.304 2 DEBUG oslo_concurrency.lockutils [req-867e7d7e-d33f-4fcb-8792-7c71e058e421 req-e35265c9-639d-40d7-ad01-aa4d9960c744 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.304 2 DEBUG nova.compute.manager [req-867e7d7e-d33f-4fcb-8792-7c71e058e421 req-e35265c9-639d-40d7-ad01-aa4d9960c744 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] No waiting events found dispatching network-vif-unplugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.304 2 DEBUG nova.compute.manager [req-867e7d7e-d33f-4fcb-8792-7c71e058e421 req-e35265c9-639d-40d7-ad01-aa4d9960c744 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-vif-unplugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.658 2 DEBUG nova.network.neutron [-] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.681 2 INFO nova.compute.manager [-] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Took 1.01 seconds to deallocate network for instance.#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.774 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.775 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.805 2 DEBUG nova.compute.manager [req-078f643f-0576-4ba4-897b-f71ef494717f req-f593ddf8-0030-4f7e-8cdd-8c9238c1363d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-vif-deleted-48594db4-66a1-4d35-90bd-fa3b1e19b08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.860 2 DEBUG nova.compute.provider_tree [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.884 2 DEBUG nova.scheduler.client.report [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.908 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:42 np0005466013 nova_compute[192144]: 2025-10-02 12:12:42.931 2 INFO nova.scheduler.client.report [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Deleted allocations for instance c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0#033[00m
Oct  2 08:12:43 np0005466013 nova_compute[192144]: 2025-10-02 12:12:43.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466013 nova_compute[192144]: 2025-10-02 12:12:43.019 2 DEBUG oslo_concurrency.lockutils [None req-d07e1fd0-c83d-47b0-af60-2f8a7aaba2d9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:43 np0005466013 nova_compute[192144]: 2025-10-02 12:12:43.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:44 np0005466013 nova_compute[192144]: 2025-10-02 12:12:44.492 2 DEBUG nova.compute.manager [req-5216d137-f969-45d9-bca8-b4cef130bbe7 req-0d0171bf-8f6a-4a80-bd9e-9e641d85393a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received event network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:44 np0005466013 nova_compute[192144]: 2025-10-02 12:12:44.493 2 DEBUG oslo_concurrency.lockutils [req-5216d137-f969-45d9-bca8-b4cef130bbe7 req-0d0171bf-8f6a-4a80-bd9e-9e641d85393a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:44 np0005466013 nova_compute[192144]: 2025-10-02 12:12:44.493 2 DEBUG oslo_concurrency.lockutils [req-5216d137-f969-45d9-bca8-b4cef130bbe7 req-0d0171bf-8f6a-4a80-bd9e-9e641d85393a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:44 np0005466013 nova_compute[192144]: 2025-10-02 12:12:44.493 2 DEBUG oslo_concurrency.lockutils [req-5216d137-f969-45d9-bca8-b4cef130bbe7 req-0d0171bf-8f6a-4a80-bd9e-9e641d85393a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:44 np0005466013 nova_compute[192144]: 2025-10-02 12:12:44.494 2 DEBUG nova.compute.manager [req-5216d137-f969-45d9-bca8-b4cef130bbe7 req-0d0171bf-8f6a-4a80-bd9e-9e641d85393a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] No waiting events found dispatching network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:44 np0005466013 nova_compute[192144]: 2025-10-02 12:12:44.494 2 WARNING nova.compute.manager [req-5216d137-f969-45d9-bca8-b4cef130bbe7 req-0d0171bf-8f6a-4a80-bd9e-9e641d85393a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Received unexpected event network-vif-plugged-48594db4-66a1-4d35-90bd-fa3b1e19b08a for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:12:46 np0005466013 nova_compute[192144]: 2025-10-02 12:12:46.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:46 np0005466013 nova_compute[192144]: 2025-10-02 12:12:46.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:46 np0005466013 podman[228284]: 2025-10-02 12:12:46.699784179 +0000 UTC m=+0.061128111 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Oct  2 08:12:47 np0005466013 nova_compute[192144]: 2025-10-02 12:12:47.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:48 np0005466013 nova_compute[192144]: 2025-10-02 12:12:48.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:48 np0005466013 nova_compute[192144]: 2025-10-02 12:12:48.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:48 np0005466013 nova_compute[192144]: 2025-10-02 12:12:48.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:49 np0005466013 podman[228306]: 2025-10-02 12:12:49.679740204 +0000 UTC m=+0.054948400 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 08:12:49 np0005466013 podman[228305]: 2025-10-02 12:12:49.708713564 +0000 UTC m=+0.086210370 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:12:51 np0005466013 nova_compute[192144]: 2025-10-02 12:12:51.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.027 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.027 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.048 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.218 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.218 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.224 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.224 2 INFO nova.compute.claims [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.421 2 DEBUG nova.compute.provider_tree [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.447 2 DEBUG nova.scheduler.client.report [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.520 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.521 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.638 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.639 2 DEBUG nova.network.neutron [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.689 2 INFO nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.723 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.974 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.976 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.976 2 INFO nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Creating image(s)#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.977 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.977 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.978 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:52 np0005466013 nova_compute[192144]: 2025-10-02 12:12:52.991 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.051 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.052 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.053 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.064 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.105 2 DEBUG nova.policy [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'def48c13fd6a43ba88836b753986a731', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffae703d68b24b9c89686c149113fc2b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.116 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.117 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.125 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.126 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.145 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.165 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.168 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.168 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.231 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.232 2 DEBUG nova.virt.disk.api [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Checking if we can resize image /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.234 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.303 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.304 2 DEBUG nova.virt.disk.api [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Cannot resize image /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.304 2 DEBUG nova.objects.instance [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.347 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.347 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Ensure instance console log exists: /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.348 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.348 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.349 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.370 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.370 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.376 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.376 2 INFO nova.compute.claims [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:12:53 np0005466013 podman[228360]: 2025-10-02 12:12:53.680243256 +0000 UTC m=+0.049763787 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:12:53 np0005466013 podman[228361]: 2025-10-02 12:12:53.693609892 +0000 UTC m=+0.055143255 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible)
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.716 2 DEBUG nova.compute.provider_tree [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.736 2 DEBUG nova.scheduler.client.report [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.761 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.761 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.816 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.816 2 DEBUG nova.network.neutron [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.839 2 INFO nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:12:53 np0005466013 nova_compute[192144]: 2025-10-02 12:12:53.858 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.012 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.015 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.016 2 INFO nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Creating image(s)#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.017 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "/var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.017 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "/var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.018 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "/var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.031 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.099 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.101 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.103 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.121 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.190 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.191 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.233 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.235 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.235 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.298 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.299 2 DEBUG nova.virt.disk.api [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Checking if we can resize image /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.301 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.372 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.373 2 DEBUG nova.virt.disk.api [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Cannot resize image /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.374 2 DEBUG nova.objects.instance [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lazy-loading 'migration_context' on Instance uuid f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.394 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.395 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Ensure instance console log exists: /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.396 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.396 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.397 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:54 np0005466013 nova_compute[192144]: 2025-10-02 12:12:54.474 2 DEBUG nova.policy [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:12:55 np0005466013 nova_compute[192144]: 2025-10-02 12:12:55.045 2 DEBUG nova.network.neutron [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Successfully created port: ad0f91db-0150-4421-a472-f9984b0a20bc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:12:55 np0005466013 nova_compute[192144]: 2025-10-02 12:12:55.829 2 DEBUG nova.network.neutron [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Successfully created port: af4c0ef6-8643-4a83-be1e-4000ab5fd894 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.582 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407161.5805259, c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.582 2 INFO nova.compute.manager [-] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.606 2 DEBUG nova.compute.manager [None req-ebe27f45-5e7c-4862-a16e-552f305b5471 - - - - - -] [instance: c2e7885e-f6ac-4b03-922b-c1b4ae6ef9a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.796 2 DEBUG nova.network.neutron [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Successfully updated port: ad0f91db-0150-4421-a472-f9984b0a20bc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.827 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-960cdfa5-111c-4d08-82c7-29134bd55212" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.827 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-960cdfa5-111c-4d08-82c7-29134bd55212" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.827 2 DEBUG nova.network.neutron [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.913 2 DEBUG nova.compute.manager [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-changed-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.913 2 DEBUG nova.compute.manager [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Refreshing instance network info cache due to event network-changed-ad0f91db-0150-4421-a472-f9984b0a20bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:12:56 np0005466013 nova_compute[192144]: 2025-10-02 12:12:56.914 2 DEBUG oslo_concurrency.lockutils [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-960cdfa5-111c-4d08-82c7-29134bd55212" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.056 2 DEBUG nova.network.neutron [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.122 2 DEBUG nova.network.neutron [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Successfully updated port: af4c0ef6-8643-4a83-be1e-4000ab5fd894 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.207 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.207 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquired lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.207 2 DEBUG nova.network.neutron [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.478 2 DEBUG nova.network.neutron [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.952 2 DEBUG nova.network.neutron [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Updating instance_info_cache with network_info: [{"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.976 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-960cdfa5-111c-4d08-82c7-29134bd55212" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.977 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance network_info: |[{"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.977 2 DEBUG oslo_concurrency.lockutils [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-960cdfa5-111c-4d08-82c7-29134bd55212" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.977 2 DEBUG nova.network.neutron [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Refreshing network info cache for port ad0f91db-0150-4421-a472-f9984b0a20bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.980 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Start _get_guest_xml network_info=[{"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.984 2 WARNING nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.988 2 DEBUG nova.virt.libvirt.host [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.989 2 DEBUG nova.virt.libvirt.host [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.991 2 DEBUG nova.virt.libvirt.host [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.992 2 DEBUG nova.virt.libvirt.host [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.993 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.993 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.994 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.994 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.994 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.994 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.995 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.995 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.995 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.995 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.995 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.996 2 DEBUG nova.virt.hardware [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:12:57 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.999 2 DEBUG nova.virt.libvirt.vif [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1330483951',display_name='tempest-ServerDiskConfigTestJSON-server-1330483951',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1330483951',id=62,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-bb4if0yp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:52Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=960cdfa5-111c-4d08-82c7-29134bd55212,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:57.999 2 DEBUG nova.network.os_vif_util [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.001 2 DEBUG nova.network.os_vif_util [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.002 2 DEBUG nova.objects.instance [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.023 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <uuid>960cdfa5-111c-4d08-82c7-29134bd55212</uuid>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <name>instance-0000003e</name>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1330483951</nova:name>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:12:57</nova:creationTime>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:user uuid="def48c13fd6a43ba88836b753986a731">tempest-ServerDiskConfigTestJSON-1763056137-project-member</nova:user>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:project uuid="ffae703d68b24b9c89686c149113fc2b">tempest-ServerDiskConfigTestJSON-1763056137</nova:project>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        <nova:port uuid="ad0f91db-0150-4421-a472-f9984b0a20bc">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <entry name="serial">960cdfa5-111c-4d08-82c7-29134bd55212</entry>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <entry name="uuid">960cdfa5-111c-4d08-82c7-29134bd55212</entry>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:62:bc:7f"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <target dev="tapad0f91db-01"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/console.log" append="off"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:12:58 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:12:58 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:12:58 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:12:58 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.025 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Preparing to wait for external event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.025 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.025 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.026 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.026 2 DEBUG nova.virt.libvirt.vif [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1330483951',display_name='tempest-ServerDiskConfigTestJSON-server-1330483951',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1330483951',id=62,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-bb4if0yp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:52Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=960cdfa5-111c-4d08-82c7-29134bd55212,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.027 2 DEBUG nova.network.os_vif_util [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.027 2 DEBUG nova.network.os_vif_util [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.028 2 DEBUG os_vif [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.029 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.029 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad0f91db-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapad0f91db-01, col_values=(('external_ids', {'iface-id': 'ad0f91db-0150-4421-a472-f9984b0a20bc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:bc:7f', 'vm-uuid': '960cdfa5-111c-4d08-82c7-29134bd55212'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:58 np0005466013 NetworkManager[51205]: <info>  [1759407178.0370] manager: (tapad0f91db-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.043 2 INFO os_vif [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01')#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.114 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.115 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.115 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No VIF found with MAC fa:16:3e:62:bc:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.116 2 INFO nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Using config drive#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.977 2 INFO nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Creating config drive at /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config#033[00m
Oct  2 08:12:58 np0005466013 nova_compute[192144]: 2025-10-02 12:12:58.984 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8i5nyiap execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.004 2 DEBUG nova.network.neutron [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updating instance_info_cache with network_info: [{"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.036 2 DEBUG nova.compute.manager [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-changed-af4c0ef6-8643-4a83-be1e-4000ab5fd894 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.036 2 DEBUG nova.compute.manager [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Refreshing instance network info cache due to event network-changed-af4c0ef6-8643-4a83-be1e-4000ab5fd894. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.036 2 DEBUG oslo_concurrency.lockutils [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.039 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Releasing lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.039 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Instance network_info: |[{"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.040 2 DEBUG oslo_concurrency.lockutils [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.040 2 DEBUG nova.network.neutron [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Refreshing network info cache for port af4c0ef6-8643-4a83-be1e-4000ab5fd894 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.043 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Start _get_guest_xml network_info=[{"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.048 2 WARNING nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.053 2 DEBUG nova.virt.libvirt.host [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.054 2 DEBUG nova.virt.libvirt.host [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.061 2 DEBUG nova.virt.libvirt.host [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.061 2 DEBUG nova.virt.libvirt.host [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.062 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.063 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.063 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.063 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.063 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.064 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.064 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.064 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.064 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.065 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.065 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.065 2 DEBUG nova.virt.hardware [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.068 2 DEBUG nova.virt.libvirt.vif [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=63,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPn0xpGhVjQWIudBz7gv1qPDp8tKuxdf0JEzn8tPqDQfHIB9ebNoAyxeWg2Ca9UuDQGIerOpqmy+4eH92L/OuNtU0gu0/6j03K3Wlz31kXoWZ7qiagw3/7mChywZRPIBg==',key_name='tempest-keypair-644342312',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7cc67bd4da7644d3bd8155cc7f188aa4',ramdisk_id='',reservation_id='r-40n50i1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-347469803',owner_user_name='tempest-ServersV294TestFqdnHostnames-347469803-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a212b40f430d496d94ca57954f39afd6',uuid=f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.069 2 DEBUG nova.network.os_vif_util [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Converting VIF {"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.069 2 DEBUG nova.network.os_vif_util [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:c4:a7,bridge_name='br-int',has_traffic_filtering=True,id=af4c0ef6-8643-4a83-be1e-4000ab5fd894,network=Network(d8c71e68-a016-4099-877d-881b9a6e634c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf4c0ef6-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.070 2 DEBUG nova.objects.instance [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lazy-loading 'pci_devices' on Instance uuid f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.084 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <uuid>f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b</uuid>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <name>instance-0000003f</name>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <nova:name>guest-instance-1</nova:name>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:12:59</nova:creationTime>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:user uuid="a212b40f430d496d94ca57954f39afd6">tempest-ServersV294TestFqdnHostnames-347469803-project-member</nova:user>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:project uuid="7cc67bd4da7644d3bd8155cc7f188aa4">tempest-ServersV294TestFqdnHostnames-347469803</nova:project>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        <nova:port uuid="af4c0ef6-8643-4a83-be1e-4000ab5fd894">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <entry name="serial">f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b</entry>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <entry name="uuid">f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b</entry>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.config"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:a9:c4:a7"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <target dev="tapaf4c0ef6-86"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/console.log" append="off"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:12:59 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:12:59 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:12:59 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:12:59 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.085 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Preparing to wait for external event network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.085 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.086 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.086 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.087 2 DEBUG nova.virt.libvirt.vif [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=63,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPn0xpGhVjQWIudBz7gv1qPDp8tKuxdf0JEzn8tPqDQfHIB9ebNoAyxeWg2Ca9UuDQGIerOpqmy+4eH92L/OuNtU0gu0/6j03K3Wlz31kXoWZ7qiagw3/7mChywZRPIBg==',key_name='tempest-keypair-644342312',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7cc67bd4da7644d3bd8155cc7f188aa4',ramdisk_id='',reservation_id='r-40n50i1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-347469803',owner_user_name='tempest-ServersV294TestFqdnHostnames-347469803-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:12:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a212b40f430d496d94ca57954f39afd6',uuid=f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.087 2 DEBUG nova.network.os_vif_util [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Converting VIF {"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.087 2 DEBUG nova.network.os_vif_util [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:c4:a7,bridge_name='br-int',has_traffic_filtering=True,id=af4c0ef6-8643-4a83-be1e-4000ab5fd894,network=Network(d8c71e68-a016-4099-877d-881b9a6e634c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf4c0ef6-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.088 2 DEBUG os_vif [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:c4:a7,bridge_name='br-int',has_traffic_filtering=True,id=af4c0ef6-8643-4a83-be1e-4000ab5fd894,network=Network(d8c71e68-a016-4099-877d-881b9a6e634c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf4c0ef6-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.095 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf4c0ef6-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.095 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf4c0ef6-86, col_values=(('external_ids', {'iface-id': 'af4c0ef6-8643-4a83-be1e-4000ab5fd894', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:c4:a7', 'vm-uuid': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 NetworkManager[51205]: <info>  [1759407179.0980] manager: (tapaf4c0ef6-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.104 2 INFO os_vif [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:c4:a7,bridge_name='br-int',has_traffic_filtering=True,id=af4c0ef6-8643-4a83-be1e-4000ab5fd894,network=Network(d8c71e68-a016-4099-877d-881b9a6e634c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf4c0ef6-86')#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.109 2 DEBUG oslo_concurrency.processutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8i5nyiap" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.147 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.148 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.148 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] No VIF found with MAC fa:16:3e:a9:c4:a7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.148 2 INFO nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Using config drive#033[00m
Oct  2 08:12:59 np0005466013 kernel: tapad0f91db-01: entered promiscuous mode
Oct  2 08:12:59 np0005466013 NetworkManager[51205]: <info>  [1759407179.1682] manager: (tapad0f91db-01): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:59Z|00182|binding|INFO|Claiming lport ad0f91db-0150-4421-a472-f9984b0a20bc for this chassis.
Oct  2 08:12:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:59Z|00183|binding|INFO|ad0f91db-0150-4421-a472-f9984b0a20bc: Claiming fa:16:3e:62:bc:7f 10.100.0.8
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.184 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bc:7f 10.100.0.8'], port_security=['fa:16:3e:62:bc:7f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '960cdfa5-111c-4d08-82c7-29134bd55212', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ad0f91db-0150-4421-a472-f9984b0a20bc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.185 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ad0f91db-0150-4421-a472-f9984b0a20bc in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b bound to our chassis#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.187 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b#033[00m
Oct  2 08:12:59 np0005466013 systemd-udevd[228441]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.200 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d9aacccd-0a2d-4a08-aec4-55214e0a49f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.201 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6de4737-c1 in ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.203 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6de4737-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.203 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0d2e23-a09b-4bb0-87f0-2f1b3ae5c519]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.204 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9e874a9b-f07d-41ed-98ac-09a77d785526]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 systemd-machined[152202]: New machine qemu-25-instance-0000003e.
Oct  2 08:12:59 np0005466013 NetworkManager[51205]: <info>  [1759407179.2124] device (tapad0f91db-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:12:59 np0005466013 NetworkManager[51205]: <info>  [1759407179.2136] device (tapad0f91db-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.217 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0bb532-d931-4ea1-b0fa-83e82d615322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:59Z|00184|binding|INFO|Setting lport ad0f91db-0150-4421-a472-f9984b0a20bc ovn-installed in OVS
Oct  2 08:12:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:59Z|00185|binding|INFO|Setting lport ad0f91db-0150-4421-a472-f9984b0a20bc up in Southbound
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 systemd[1]: Started Virtual Machine qemu-25-instance-0000003e.
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.246 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[18e2d45f-5b84-470f-bd3f-493195b75c54]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.273 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fb35804d-6944-4b2f-b178-e095ef3a8d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 systemd-udevd[228445]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:59 np0005466013 NetworkManager[51205]: <info>  [1759407179.2799] manager: (tapd6de4737-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.278 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aa231e25-ce0c-4bb5-a40d-1be717569ff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.307 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4b01bc49-43b0-4bb0-a41a-078696abc7cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.311 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[5a852321-3f89-4880-8414-728c65ec2d6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 NetworkManager[51205]: <info>  [1759407179.3363] device (tapd6de4737-c0): carrier: link connected
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.340 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bc7f0d-bd9d-4aa1-805b-b2559d0c5204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.359 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8011950b-2170-449d-aa03-a6a33ee9cd9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517095, 'reachable_time': 18495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228479, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.374 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[77f50eb9-46e6-4e57-b796-b9ce562fc0cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:c91f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517095, 'tstamp': 517095}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228480, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.394 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[524c0b71-8d77-4a8c-9597-6d48c4c220be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517095, 'reachable_time': 18495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228481, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.425 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[67b317ea-aebb-48d5-9426-c6356a51468f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.487 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1d8b51-3068-4735-ad51-417ff5e8d9e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.491 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.491 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.492 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6de4737-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 kernel: tapd6de4737-c0: entered promiscuous mode
Oct  2 08:12:59 np0005466013 NetworkManager[51205]: <info>  [1759407179.4967] manager: (tapd6de4737-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.501 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6de4737-c0, col_values=(('external_ids', {'iface-id': 'cc451eb7-bf34-4b54-96d8-b834f11e06fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:12:59Z|00186|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.520 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.521 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[83107889-c89c-415c-852b-c6f5f38cad2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.522 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:12:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:12:59.523 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'env', 'PROCESS_TAG=haproxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.908 2 DEBUG nova.compute.manager [req-34921ce4-b79a-428b-8ddc-0d40591b715e req-4e9387f7-70ed-4a96-a72f-793f3e96eda8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.909 2 DEBUG oslo_concurrency.lockutils [req-34921ce4-b79a-428b-8ddc-0d40591b715e req-4e9387f7-70ed-4a96-a72f-793f3e96eda8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.910 2 DEBUG oslo_concurrency.lockutils [req-34921ce4-b79a-428b-8ddc-0d40591b715e req-4e9387f7-70ed-4a96-a72f-793f3e96eda8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.910 2 DEBUG oslo_concurrency.lockutils [req-34921ce4-b79a-428b-8ddc-0d40591b715e req-4e9387f7-70ed-4a96-a72f-793f3e96eda8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.910 2 DEBUG nova.compute.manager [req-34921ce4-b79a-428b-8ddc-0d40591b715e req-4e9387f7-70ed-4a96-a72f-793f3e96eda8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Processing event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:12:59 np0005466013 podman[228525]: 2025-10-02 12:12:59.934370739 +0000 UTC m=+0.065497277 container create 5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:12:59 np0005466013 systemd[1]: Started libpod-conmon-5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e.scope.
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.977 2 INFO nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Creating config drive at /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.config#033[00m
Oct  2 08:12:59 np0005466013 nova_compute[192144]: 2025-10-02 12:12:59.982 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph0ae7qnb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:59 np0005466013 podman[228525]: 2025-10-02 12:12:59.899811745 +0000 UTC m=+0.030938313 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:13:00 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:13:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bee06db7ff2660ffa7c4380f9c6f3ad9115400452f1fcaa03b9a7c87ec049d35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:13:00 np0005466013 podman[228525]: 2025-10-02 12:13:00.027570756 +0000 UTC m=+0.158697314 container init 5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:13:00 np0005466013 podman[228525]: 2025-10-02 12:13:00.033110788 +0000 UTC m=+0.164237326 container start 5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:13:00 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[228540]: [NOTICE]   (228547) : New worker (228549) forked
Oct  2 08:13:00 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[228540]: [NOTICE]   (228547) : Loading success.
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.088 2 DEBUG nova.network.neutron [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Updated VIF entry in instance network info cache for port ad0f91db-0150-4421-a472-f9984b0a20bc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.089 2 DEBUG nova.network.neutron [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Updating instance_info_cache with network_info: [{"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.113 2 DEBUG oslo_concurrency.processutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph0ae7qnb" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.121 2 DEBUG oslo_concurrency.lockutils [req-20cba1fb-7489-4121-9bf5-202079995a48 req-78bda249-e39d-4cd4-bf16-00d729778ae6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-960cdfa5-111c-4d08-82c7-29134bd55212" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.152 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.155 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407180.1544774, 960cdfa5-111c-4d08-82c7-29134bd55212 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.157 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] VM Started (Lifecycle Event)#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.160 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.169 2 INFO nova.virt.libvirt.driver [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance spawned successfully.#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.170 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:13:00 np0005466013 NetworkManager[51205]: <info>  [1759407180.1913] manager: (tapaf4c0ef6-86): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Oct  2 08:13:00 np0005466013 kernel: tapaf4c0ef6-86: entered promiscuous mode
Oct  2 08:13:00 np0005466013 systemd-udevd[228471]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.193 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:00Z|00187|binding|INFO|Claiming lport af4c0ef6-8643-4a83-be1e-4000ab5fd894 for this chassis.
Oct  2 08:13:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:00Z|00188|binding|INFO|af4c0ef6-8643-4a83-be1e-4000ab5fd894: Claiming fa:16:3e:a9:c4:a7 10.100.0.8
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.205 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:00 np0005466013 NetworkManager[51205]: <info>  [1759407180.2105] device (tapaf4c0ef6-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:13:00 np0005466013 NetworkManager[51205]: <info>  [1759407180.2116] device (tapaf4c0ef6-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.215 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:c4:a7 10.100.0.8'], port_security=['fa:16:3e:a9:c4:a7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8c71e68-a016-4099-877d-881b9a6e634c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7edffac1-7798-43a3-9bf2-487473d2826c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d8e7264-2c9c-43b0-92cb-5dba96a13a6a, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=af4c0ef6-8643-4a83-be1e-4000ab5fd894) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.217 103323 INFO neutron.agent.ovn.metadata.agent [-] Port af4c0ef6-8643-4a83-be1e-4000ab5fd894 in datapath d8c71e68-a016-4099-877d-881b9a6e634c bound to our chassis#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.218 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8c71e68-a016-4099-877d-881b9a6e634c#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.220 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.221 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.221 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.221 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.222 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.222 2 DEBUG nova.virt.libvirt.driver [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.232 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4518cf-4790-4bf3-9278-d2e7342d59df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.233 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8c71e68-a1 in ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.238 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8c71e68-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.238 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6967a909-5fa4-4f97-b55a-2ec456ee442e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.239 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[70ad6d8f-b290-4d37-9406-4fb14e610403]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.241 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.242 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407180.154621, 960cdfa5-111c-4d08-82c7-29134bd55212 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.242 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:13:00 np0005466013 systemd-machined[152202]: New machine qemu-26-instance-0000003f.
Oct  2 08:13:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:00Z|00189|binding|INFO|Setting lport af4c0ef6-8643-4a83-be1e-4000ab5fd894 ovn-installed in OVS
Oct  2 08:13:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:00Z|00190|binding|INFO|Setting lport af4c0ef6-8643-4a83-be1e-4000ab5fd894 up in Southbound
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.254 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[af034a66-e440-4cf0-a195-ccc5c507d387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:00 np0005466013 systemd[1]: Started Virtual Machine qemu-26-instance-0000003f.
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.280 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.283 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[217af6fe-aa78-4357-8937-393d789edf6e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.289 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407180.159542, 960cdfa5-111c-4d08-82c7-29134bd55212 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.289 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.306 2 INFO nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Took 7.33 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.306 2 DEBUG nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.319 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.326 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.327 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[de5ab074-0880-45d4-b9e2-794f4d26564b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 NetworkManager[51205]: <info>  [1759407180.3368] manager: (tapd8c71e68-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.335 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aff1b24e-84c6-429e-9811-bce9ff26602d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.372 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[823a7b7d-ff5c-4b6e-9a29-6401218d9ef9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.377 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[510ff729-e687-4447-a184-1e530d2fedb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 NetworkManager[51205]: <info>  [1759407180.4061] device (tapd8c71e68-a0): carrier: link connected
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.412 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[96404504-c1b1-4810-bff9-50c41d9bc809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.431 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[36fa0496-3aad-4295-b4c9-07711e62788b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8c71e68-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:7d:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517202, 'reachable_time': 21047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228590, 'error': None, 'target': 'ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.454 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bab8394c-ab5c-4a0a-9c87-e7b5b111d20d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:7d3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517202, 'tstamp': 517202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228591, 'error': None, 'target': 'ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.477 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[12f18191-36c9-40bd-9438-e579f74f735f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8c71e68-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:7d:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517202, 'reachable_time': 21047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228592, 'error': None, 'target': 'ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.507 2 INFO nova.compute.manager [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Took 8.38 seconds to build instance.#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.525 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[beb1a5fb-7af1-446b-bef7-654e69f4e148]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.533 2 DEBUG oslo_concurrency.lockutils [None req-596c5de8-38bb-40e6-be17-3e8d1017013a def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.601 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[19ad9d76-31a1-42bd-b428-13a908dd92b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.606 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8c71e68-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.607 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.607 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8c71e68-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:00 np0005466013 NetworkManager[51205]: <info>  [1759407180.6103] manager: (tapd8c71e68-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Oct  2 08:13:00 np0005466013 kernel: tapd8c71e68-a0: entered promiscuous mode
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.615 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8c71e68-a0, col_values=(('external_ids', {'iface-id': '5ad8bf87-5f16-46a4-acdd-093425cab386'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:00Z|00191|binding|INFO|Releasing lport 5ad8bf87-5f16-46a4-acdd-093425cab386 from this chassis (sb_readonly=0)
Oct  2 08:13:00 np0005466013 nova_compute[192144]: 2025-10-02 12:13:00.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.630 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8c71e68-a016-4099-877d-881b9a6e634c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8c71e68-a016-4099-877d-881b9a6e634c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.631 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b4f5a7-8f77-488c-b50e-0d626de9084c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.632 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-d8c71e68-a016-4099-877d-881b9a6e634c
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/d8c71e68-a016-4099-877d-881b9a6e634c.pid.haproxy
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID d8c71e68-a016-4099-877d-881b9a6e634c
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:13:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:00.634 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c', 'env', 'PROCESS_TAG=haproxy-d8c71e68-a016-4099-877d-881b9a6e634c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8c71e68-a016-4099-877d-881b9a6e634c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:13:01 np0005466013 podman[228629]: 2025-10-02 12:13:01.04800207 +0000 UTC m=+0.053268786 container create 6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.059 2 DEBUG nova.network.neutron [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updated VIF entry in instance network info cache for port af4c0ef6-8643-4a83-be1e-4000ab5fd894. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.060 2 DEBUG nova.network.neutron [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updating instance_info_cache with network_info: [{"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.078 2 DEBUG oslo_concurrency.lockutils [req-127bd899-f2f7-4802-8119-f78cc2918103 req-bab08aa3-7615-4206-bdba-2a73f015d4e0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:01 np0005466013 systemd[1]: Started libpod-conmon-6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc.scope.
Oct  2 08:13:01 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:13:01 np0005466013 podman[228629]: 2025-10-02 12:13:01.021932989 +0000 UTC m=+0.027199725 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:13:01 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b08d1fed084485a971404a936ac2c0139cf9f533a0c3be14a90d793d6eb78604/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.123 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407181.1227052, f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.123 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] VM Started (Lifecycle Event)#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.128 2 DEBUG nova.compute.manager [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.129 2 DEBUG oslo_concurrency.lockutils [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.129 2 DEBUG oslo_concurrency.lockutils [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.130 2 DEBUG oslo_concurrency.lockutils [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.130 2 DEBUG nova.compute.manager [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Processing event network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.131 2 DEBUG nova.compute.manager [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.133 2 DEBUG oslo_concurrency.lockutils [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.133 2 DEBUG oslo_concurrency.lockutils [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.134 2 DEBUG oslo_concurrency.lockutils [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:01 np0005466013 podman[228629]: 2025-10-02 12:13:01.133377923 +0000 UTC m=+0.138644649 container init 6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.134 2 DEBUG nova.compute.manager [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] No waiting events found dispatching network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.135 2 WARNING nova.compute.manager [req-2d2a7533-23d4-4586-81f7-538195a1e1a6 req-f2ccfc00-ed4e-447f-bd7d-fbdcb300c08c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received unexpected event network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.136 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.143 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:01 np0005466013 podman[228629]: 2025-10-02 12:13:01.145280093 +0000 UTC m=+0.150546809 container start 6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.163 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.164 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.168 2 INFO nova.virt.libvirt.driver [-] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Instance spawned successfully.#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.170 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:13:01 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [NOTICE]   (228648) : New worker (228650) forked
Oct  2 08:13:01 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [NOTICE]   (228648) : Loading success.
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.193 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.193 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407181.1228166, f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.194 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.203 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.204 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.204 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.205 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.206 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.206 2 DEBUG nova.virt.libvirt.driver [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.230 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.235 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407181.1408777, f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.236 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.266 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.271 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.293 2 INFO nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Took 7.28 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.294 2 DEBUG nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.297 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.398 2 INFO nova.compute.manager [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Took 8.14 seconds to build instance.#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.417 2 DEBUG oslo_concurrency.lockutils [None req-687a78ce-bad1-41e3-aa6f-0f7cd59acfc0 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.984 2 DEBUG nova.compute.manager [req-833c1b8c-d490-4de6-a6a9-cedac9baec55 req-15df68d1-fc11-4950-98af-c4e2a8d430b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.985 2 DEBUG oslo_concurrency.lockutils [req-833c1b8c-d490-4de6-a6a9-cedac9baec55 req-15df68d1-fc11-4950-98af-c4e2a8d430b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.985 2 DEBUG oslo_concurrency.lockutils [req-833c1b8c-d490-4de6-a6a9-cedac9baec55 req-15df68d1-fc11-4950-98af-c4e2a8d430b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.985 2 DEBUG oslo_concurrency.lockutils [req-833c1b8c-d490-4de6-a6a9-cedac9baec55 req-15df68d1-fc11-4950-98af-c4e2a8d430b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.986 2 DEBUG nova.compute.manager [req-833c1b8c-d490-4de6-a6a9-cedac9baec55 req-15df68d1-fc11-4950-98af-c4e2a8d430b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] No waiting events found dispatching network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:01 np0005466013 nova_compute[192144]: 2025-10-02 12:13:01.986 2 WARNING nova.compute.manager [req-833c1b8c-d490-4de6-a6a9-cedac9baec55 req-15df68d1-fc11-4950-98af-c4e2a8d430b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received unexpected event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc for instance with vm_state active and task_state None.#033[00m
Oct  2 08:13:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:02.294 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:02.296 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:02.297 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:02 np0005466013 nova_compute[192144]: 2025-10-02 12:13:02.967 2 INFO nova.compute.manager [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Rebuilding instance#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:03 np0005466013 NetworkManager[51205]: <info>  [1759407183.0843] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Oct  2 08:13:03 np0005466013 NetworkManager[51205]: <info>  [1759407183.0855] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:03Z|00192|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:13:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:03Z|00193|binding|INFO|Releasing lport 5ad8bf87-5f16-46a4-acdd-093425cab386 from this chassis (sb_readonly=0)
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.225 2 DEBUG nova.compute.manager [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.305 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_requests' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.327 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.340 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'resources' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.350 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.367 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:13:03 np0005466013 nova_compute[192144]: 2025-10-02 12:13:03.370 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:13:04 np0005466013 nova_compute[192144]: 2025-10-02 12:13:04.066 2 DEBUG nova.compute.manager [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-changed-af4c0ef6-8643-4a83-be1e-4000ab5fd894 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:04 np0005466013 nova_compute[192144]: 2025-10-02 12:13:04.067 2 DEBUG nova.compute.manager [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Refreshing instance network info cache due to event network-changed-af4c0ef6-8643-4a83-be1e-4000ab5fd894. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:13:04 np0005466013 nova_compute[192144]: 2025-10-02 12:13:04.067 2 DEBUG oslo_concurrency.lockutils [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:04 np0005466013 nova_compute[192144]: 2025-10-02 12:13:04.067 2 DEBUG oslo_concurrency.lockutils [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:04 np0005466013 nova_compute[192144]: 2025-10-02 12:13:04.067 2 DEBUG nova.network.neutron [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Refreshing network info cache for port af4c0ef6-8643-4a83-be1e-4000ab5fd894 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:13:04 np0005466013 nova_compute[192144]: 2025-10-02 12:13:04.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:06 np0005466013 nova_compute[192144]: 2025-10-02 12:13:06.016 2 DEBUG nova.network.neutron [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updated VIF entry in instance network info cache for port af4c0ef6-8643-4a83-be1e-4000ab5fd894. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:13:06 np0005466013 nova_compute[192144]: 2025-10-02 12:13:06.019 2 DEBUG nova.network.neutron [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updating instance_info_cache with network_info: [{"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:06 np0005466013 nova_compute[192144]: 2025-10-02 12:13:06.044 2 DEBUG oslo_concurrency.lockutils [req-2dcbdda5-dc3f-46a2-ac9e-770c5378505f req-72a2fcaa-1615-4a31-b746-e3885be3c09e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:06 np0005466013 podman[228661]: 2025-10-02 12:13:06.712508987 +0000 UTC m=+0.077455177 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:13:06 np0005466013 podman[228660]: 2025-10-02 12:13:06.744038858 +0000 UTC m=+0.114475969 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:13:06 np0005466013 podman[228662]: 2025-10-02 12:13:06.774143423 +0000 UTC m=+0.141025414 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:13:08 np0005466013 nova_compute[192144]: 2025-10-02 12:13:08.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:09 np0005466013 nova_compute[192144]: 2025-10-02 12:13:09.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:13 np0005466013 nova_compute[192144]: 2025-10-02 12:13:13.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:13 np0005466013 nova_compute[192144]: 2025-10-02 12:13:13.422 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:13:14 np0005466013 nova_compute[192144]: 2025-10-02 12:13:14.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:14Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:c4:a7 10.100.0.8
Oct  2 08:13:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:14Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:c4:a7 10.100.0.8
Oct  2 08:13:15 np0005466013 kernel: tapad0f91db-01 (unregistering): left promiscuous mode
Oct  2 08:13:15 np0005466013 NetworkManager[51205]: <info>  [1759407195.7564] device (tapad0f91db-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:15Z|00194|binding|INFO|Releasing lport ad0f91db-0150-4421-a472-f9984b0a20bc from this chassis (sb_readonly=0)
Oct  2 08:13:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:15Z|00195|binding|INFO|Setting lport ad0f91db-0150-4421-a472-f9984b0a20bc down in Southbound
Oct  2 08:13:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:15Z|00196|binding|INFO|Removing iface tapad0f91db-01 ovn-installed in OVS
Oct  2 08:13:15 np0005466013 nova_compute[192144]: 2025-10-02 12:13:15.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:15.779 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bc:7f 10.100.0.8'], port_security=['fa:16:3e:62:bc:7f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '960cdfa5-111c-4d08-82c7-29134bd55212', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ad0f91db-0150-4421-a472-f9984b0a20bc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:15.781 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ad0f91db-0150-4421-a472-f9984b0a20bc in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:13:15 np0005466013 nova_compute[192144]: 2025-10-02 12:13:15.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:15.784 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:15.786 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7532012d-d1b3-44bd-9377-2173b9b65fda]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:15.788 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace which is not needed anymore#033[00m
Oct  2 08:13:15 np0005466013 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Oct  2 08:13:15 np0005466013 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003e.scope: Consumed 13.693s CPU time.
Oct  2 08:13:15 np0005466013 systemd-machined[152202]: Machine qemu-25-instance-0000003e terminated.
Oct  2 08:13:15 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[228540]: [NOTICE]   (228547) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:15 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[228540]: [NOTICE]   (228547) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:15 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[228540]: [WARNING]  (228547) : Exiting Master process...
Oct  2 08:13:15 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[228540]: [ALERT]    (228547) : Current worker (228549) exited with code 143 (Terminated)
Oct  2 08:13:15 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[228540]: [WARNING]  (228547) : All workers exited. Exiting... (0)
Oct  2 08:13:15 np0005466013 systemd[1]: libpod-5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e.scope: Deactivated successfully.
Oct  2 08:13:15 np0005466013 podman[228781]: 2025-10-02 12:13:15.949086003 +0000 UTC m=+0.051566324 container died 5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:13:15 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 systemd[1]: var-lib-containers-storage-overlay-bee06db7ff2660ffa7c4380f9c6f3ad9115400452f1fcaa03b9a7c87ec049d35-merged.mount: Deactivated successfully.
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 podman[228781]: 2025-10-02 12:13:16.022128603 +0000 UTC m=+0.124608934 container cleanup 5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:13:16 np0005466013 systemd[1]: libpod-conmon-5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e.scope: Deactivated successfully.
Oct  2 08:13:16 np0005466013 podman[228818]: 2025-10-02 12:13:16.094514502 +0000 UTC m=+0.049014724 container remove 5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.101 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cb05e2ac-df5d-4619-bacb-688d32f31d1f]: (4, ('Thu Oct  2 12:13:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e)\n5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e\nThu Oct  2 12:13:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e)\n5b2183ea0b9686c4611a28f51e22cf2a95b0aff4cb4ebbac8ef12faeb37cf61e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.105 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[541ebf74-5d37-43f2-bae6-a66cdd289b17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.106 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 kernel: tapd6de4737-c0: left promiscuous mode
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.130 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab7f8c4-7cbd-477f-a341-45cb1970ca38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.173 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ab261e-00c5-4795-9122-d7dea98a3a96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.175 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7953efec-f4a1-437c-b4b4-f49a3a3ebde4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.192 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb96c77-3c2c-4180-923f-fda576af1153]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517088, 'reachable_time': 43577, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228844, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.197 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:16.197 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b4acf59c-a424-4174-925b-e4b788110b02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:16 np0005466013 systemd[1]: run-netns-ovnmeta\x2dd6de4737\x2dca60\x2d4c8d\x2dbfd5\x2d687f9366ec8b.mount: Deactivated successfully.
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.236 2 DEBUG nova.compute.manager [req-14c3a543-cdca-427c-a509-e1feb2eabed1 req-5f8ccba7-45c0-4533-9e3d-80727359306f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-unplugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.237 2 DEBUG oslo_concurrency.lockutils [req-14c3a543-cdca-427c-a509-e1feb2eabed1 req-5f8ccba7-45c0-4533-9e3d-80727359306f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.237 2 DEBUG oslo_concurrency.lockutils [req-14c3a543-cdca-427c-a509-e1feb2eabed1 req-5f8ccba7-45c0-4533-9e3d-80727359306f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.237 2 DEBUG oslo_concurrency.lockutils [req-14c3a543-cdca-427c-a509-e1feb2eabed1 req-5f8ccba7-45c0-4533-9e3d-80727359306f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.237 2 DEBUG nova.compute.manager [req-14c3a543-cdca-427c-a509-e1feb2eabed1 req-5f8ccba7-45c0-4533-9e3d-80727359306f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] No waiting events found dispatching network-vif-unplugged-ad0f91db-0150-4421-a472-f9984b0a20bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.238 2 WARNING nova.compute.manager [req-14c3a543-cdca-427c-a509-e1feb2eabed1 req-5f8ccba7-45c0-4533-9e3d-80727359306f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received unexpected event network-vif-unplugged-ad0f91db-0150-4421-a472-f9984b0a20bc for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.349 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '960cdfa5-111c-4d08-82c7-29134bd55212', 'name': 'tempest-ServerDiskConfigTestJSON-server-1330483951', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003e', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'ffae703d68b24b9c89686c149113fc2b', 'user_id': 'def48c13fd6a43ba88836b753986a731', 'hostId': '51fc3064ff6794a403bf5c49737eab184ebc5aca6a68f2fb89291b3f', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.352 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'name': 'guest-instance-1', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'hostId': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.353 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.354 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.375 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.write.latency volume: 2925653465 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.376 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a4fe466-7851-4ef5-a757-ba793f813a8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2925653465, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.353563', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d0ec78c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '5a58993b1618b4c08e324610a00eb960abfc86464b93a55c0b84d5e33263f9be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 
'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.353563', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d0ed4f2-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '689d109337b0ad2588764c94d35f5e29fccd650f91f3d5baa15b37723f9b77e4'}]}, 'timestamp': '2025-10-02 12:13:16.376980', '_unique_id': '1c22b957fbf1492580987e9b5cd492b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.378 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.379 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.380 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.383 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b / tapaf4c0ef6-86 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.384 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acc81353-9ae2-4e3d-98bf-e6ec45c1aa43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.379541', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d0ffaf8-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': 'c073b0a8e86148080d030eeb9f1e294280995c27fd91488d5727fdee92f63cc7'}]}, 'timestamp': '2025-10-02 12:13:16.384588', '_unique_id': 'e396e573503440da97a17ac19f4355a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.385 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.386 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.386 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.387 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>]
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.387 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.388 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.400 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.401 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5fef0fd-e8b5-470b-b4f5-4eb525a7032f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.387329', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d12861a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.066471162, 'message_signature': '74bd89b536e4bae1c0deb76718e9a2b2f6c96d65e6025a144409ffcf182805f2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 
'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.387329', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d1293bc-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.066471162, 'message_signature': '6a6ed9231a58d683ad36812a90f267985ac0965ba493985f7f01cebae89e4ea2'}]}, 'timestamp': '2025-10-02 12:13:16.401465', '_unique_id': '4e3db87b66654a2cb91de3482ba806d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.403 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.405 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.405 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.read.latency volume: 526633941 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.405 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.read.latency volume: 152356177 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8e862c2-e967-450c-af4d-c1fc54700909', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 526633941, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.403408', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d1336aa-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '5f6e095f047fbfcf0b104ad68f1149411183f6afe12e3536f7cc762891bd6604'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 152356177, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 
'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.403408', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d134258-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '0fde1c31364ab0c014005b2917f3f46194555139539e7d8ecaf8786bb1ddc4bb'}]}, 'timestamp': '2025-10-02 12:13:16.405989', '_unique_id': 'fab0e14d9b5843839ab60c80adfccfff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.406 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.407 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.407 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.407 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>]
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.407 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.408 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.427 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65cd5c71-c226-4b8c-acce-62de22258537', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'timestamp': '2025-10-02T12:13:16.407957', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2d169566-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.105189185, 'message_signature': '254eea89404218e57fe0ee84c0182acd91ffd9400cdd46d936b5764dac91deb4'}]}, 'timestamp': '2025-10-02 12:13:16.427896', '_unique_id': '645ec2b3e367483283debe536168cf1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.431 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.431 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.432 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d8f0f3e-cbd3-4765-8da8-1639ade03b8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.430300', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d174056-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.066471162, 'message_signature': '8ad3f23c688661e5468111197ebc2aa0f7a87fef1d2f8cde5c9fc18364f059d8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.430300', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d174e0c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.066471162, 'message_signature': '7e2b95e9980594baa71130e2bfe12af20cc32eb1fd76f5d0b3c6601ce4f82431'}]}, 'timestamp': '2025-10-02 12:13:16.432495', '_unique_id': '603cbc22a61d409bb4fea246f8b407d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.435 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.436 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.437 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.438 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.read.requests volume: 1082 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.438 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4ed557b-5911-4d66-8ff7-b612ff7cc3d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1082, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.436913', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d1835b0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '613b1b34d3ff1977cdb0de20bbe094dd93b82475bd15f3a3784bfd74205e5e5e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.436913', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d1842a8-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '3785c676b39e64a1f651413e6e9a2c3da7081aecd43a957313a5cd7d055465e4'}]}, 'timestamp': '2025-10-02 12:13:16.438728', '_unique_id': '1c8265bc70b24a3a94e2a445cacb4985'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.440 2 INFO nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.440 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.441 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.441 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.442 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e233dc5-f4d4-4a6d-a7d4-a9805dc35f7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.440706', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d18cc14-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.066471162, 'message_signature': 'c0fd3b534ba017a26d94e124f5a05289a752c567ee7e322595d110ccc9561c62'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 
'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.440706', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d18d600-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.066471162, 'message_signature': 'aa9c6f844eb59d59b55916d3be6af4316478e0d684e7dbf33ead6bbeb9e58600'}]}, 'timestamp': '2025-10-02 12:13:16.442490', '_unique_id': 'eabe43f2de3340ea9208a3c20b5bf1b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.444 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.445 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.445 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.445 2 INFO nova.virt.libvirt.driver [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance destroyed successfully.#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6db085a8-2f8e-4ec9-9be9-2f924db717e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.444385', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d19455e-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '75162af7d2fc9b4a2eb563ce2030b2c371a9efea7ff1ba55ad155d89bafe69a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 
'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.444385', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d194fcc-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '2fce30e681726757ee04b3a68deccd227bb822b972d873cec12dd23e34808f4d'}]}, 'timestamp': '2025-10-02 12:13:16.445600', '_unique_id': '105d5ebc461e4dc9b0d0c9afac65adc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.447 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.448 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.448 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.incoming.bytes volume: 1874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dc6e2e2-3c34-40fe-8433-45f37efc379b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1874, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.447203', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d19d1ea-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': 'd25f3926a6375533a8f46514f105592f4493421e75a37f4251c3e7731a275f48'}]}, 'timestamp': '2025-10-02 12:13:16.448976', '_unique_id': '14376440a94e483ebbfe1355f5b08275'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.451 2 INFO nova.virt.libvirt.driver [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance destroyed successfully.
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.451 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.451 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb204588-94ab-42b1-9776-1909b02866ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.450618', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1a4a58-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': '6c9067a81e2d9adb9e239baf023754c086450fda4a5df08350784b127f814992'}]}, 'timestamp': '2025-10-02 12:13:16.452205', '_unique_id': 'fb51ce72b20248afa7dfa517b1bfad60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.452 2 DEBUG nova.virt.libvirt.vif [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:12:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1330483951',display_name='tempest-ServerDiskConfigTestJSON-server-1330483951',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1330483951',id=62,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-bb4if0yp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:02Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=960cdfa5-111c-4d08-82c7-29134bd55212,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.452 2 DEBUG nova.network.os_vif_util [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.453 2 DEBUG nova.network.os_vif_util [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.454 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.454 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.write.bytes volume: 72777728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.454 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.453 2 DEBUG os_vif [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fa56aab-8ec7-44e3-a450-442a497e6bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72777728, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.453711', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d1ab736-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '3022c4a52f7113358b0f9ceb5f40b10b682b9f49b04e86746a0ade96290706a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.453711', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d1ac2d0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '879985c8aae612ee7e64370314d76a31d6b49e491c6c655925d085f8ddda3d89'}]}, 'timestamp': '2025-10-02 12:13:16.455117', '_unique_id': '1628ff166ccc44f5b628bb0e6355693b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.457 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.458 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.455 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad0f91db-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85ba8977-0332-4ba0-a115-20072d59adc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.457153', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1b428c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': '97065224db21398389e51b11a667320b57e5c16c0fd0fa2608f634d53e397e9e'}]}, 'timestamp': '2025-10-02 12:13:16.458375', '_unique_id': 'd97fd3dd66b34d86a818c59bbf50dcb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.462 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.462 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1649930d-c8ca-4445-8c0d-18183f9cfe15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.460357', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1be3cc-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': 'a9ab56befa055598d7a06a306e268c88818aaa5ca327c5bdff9e424158ccf085'}]}, 'timestamp': '2025-10-02 12:13:16.462528', '_unique_id': 'b69b87ef94074327a479899f2dd96679'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.463 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.464 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.464 2 INFO os_vif [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01')
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.465 2 INFO nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Deleting instance files /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212_del
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.466 2 INFO nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Deletion of /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212_del complete
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.466 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.466 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '633203c2-3838-468a-a01b-dfab6ac8ab70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.464472', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1c94f2-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': 'f2b784b8b2da5042addb7a1d94e33f3007fc7da1502f2a131caec2486bf416e3'}]}, 'timestamp': '2025-10-02 12:13:16.467035', '_unique_id': '7f4446834b5c49cda8b095a045bfdbc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.467 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.468 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.468 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.468 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>]
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.468 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.469 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.469 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b008482e-f340-412f-b9c9-0d369a198450', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.469122', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1d0e64-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': 'a7ea325f4b954debbfe10f922d0b659326f2c8b7feb1e8c112806694f8f6393d'}]}, 'timestamp': '2025-10-02 12:13:16.470144', '_unique_id': '75fbd36ecb884379b2205709ba3ee48a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.470 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.471 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.471 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.472 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1330483951>, <NovaLikeServer: guest-instance-1>]
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.472 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.472 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.473 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/cpu volume: 12200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '547a563c-09b1-4264-8631-aa8a1c8607fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12200000000, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'timestamp': '2025-10-02T12:13:16.472353', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2d1d8f4c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.105189185, 'message_signature': 'dc030fb05b8fc966b6a5ea8584d452f193a1650897ca1e06e88d8747cfb0de77'}]}, 'timestamp': '2025-10-02 12:13:16.473466', '_unique_id': 'e75cf1b68534411994c044c7a4f27457'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.474 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.475 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.475 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9437ce33-eab1-4924-ba62-f0a02c91e8cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.475039', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1df112-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': '9516385e39792b920d210b1c0e93ac4ccb9bdfd4bf82fcc3f827b47db12de4c9'}]}, 'timestamp': '2025-10-02 12:13:16.475973', '_unique_id': '4c271c7a1e7c40898b5e69f43210870f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.477 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.479 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.479 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f656b79-1aa1-4073-8f8a-26c538e3070c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.478206', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1e8280-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': '8e38ba56bd2212e56233184f2530394dc649f7f72939129fdafbac25ab2a403f'}]}, 'timestamp': '2025-10-02 12:13:16.479765', '_unique_id': 'ea46d458b5db4c5dac6b207b68a82005'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.481 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.482 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.482 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.read.bytes volume: 30099968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.483 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '026d8f19-32cb-4246-a4e4-b306cd9dc060', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30099968, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-vda', 'timestamp': '2025-10-02T12:13:16.482058', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2d1f109c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': '98bba73424c88d9385a46bb335e33a971778f8b80f92f15adfbe73a815dcc256'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 
'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-sda', 'timestamp': '2025-10-02T12:13:16.482058', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'instance-0000003f', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2d1f1db2-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.03324822, 'message_signature': 'd7c181195ab1b53b85099d1f5f1c6cf69698ed97f80767833d5488f3ffde95fa'}]}, 'timestamp': '2025-10-02 12:13:16.483636', '_unique_id': '644b492ce4a8417bb3d226fdcbf574a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.486 12 DEBUG ceilometer.compute.pollsters [-] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000003e, id=960cdfa5-111c-4d08-82c7-29134bd55212>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.486 12 DEBUG ceilometer.compute.pollsters [-] f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a9ccfe0-71c5-47e6-b1ea-3c5632afd623', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a212b40f430d496d94ca57954f39afd6', 'user_name': None, 'project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'project_name': None, 'resource_id': 'instance-0000003f-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-tapaf4c0ef6-86', 'timestamp': '2025-10-02T12:13:16.485775', 'resource_metadata': {'display_name': 'guest-instance-1', 'name': 'tapaf4c0ef6-86', 'instance_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'instance_type': 'm1.nano', 'host': 'fad5650edbbaa0bc17676f5a4cf78998d246bbd7bd79331c2b863d1c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:c4:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf4c0ef6-86'}, 'message_id': '2d1f9c88-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5188.059215457, 'message_signature': 'c2d192d276a257bdbf641c74013f7dd1f8b3989f81652744b2976173995ec138'}]}, 'timestamp': '2025-10-02 12:13:16.486937', '_unique_id': 'fa2559c7d5904a2e84dd794df806e3d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:13:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:13:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.787 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.787 2 INFO nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Creating image(s)#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.788 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.788 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.789 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.789 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:16 np0005466013 nova_compute[192144]: 2025-10-02 12:13:16.790 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:17 np0005466013 podman[228845]: 2025-10-02 12:13:17.693982932 +0000 UTC m=+0.066322191 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.382 2 DEBUG nova.compute.manager [req-3571a750-afc3-485c-a11b-1213e67397aa req-af09038d-eeaf-4bbb-9564-cd849f98bbc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.382 2 DEBUG oslo_concurrency.lockutils [req-3571a750-afc3-485c-a11b-1213e67397aa req-af09038d-eeaf-4bbb-9564-cd849f98bbc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.382 2 DEBUG oslo_concurrency.lockutils [req-3571a750-afc3-485c-a11b-1213e67397aa req-af09038d-eeaf-4bbb-9564-cd849f98bbc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.383 2 DEBUG oslo_concurrency.lockutils [req-3571a750-afc3-485c-a11b-1213e67397aa req-af09038d-eeaf-4bbb-9564-cd849f98bbc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.383 2 DEBUG nova.compute.manager [req-3571a750-afc3-485c-a11b-1213e67397aa req-af09038d-eeaf-4bbb-9564-cd849f98bbc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] No waiting events found dispatching network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.383 2 WARNING nova.compute.manager [req-3571a750-afc3-485c-a11b-1213e67397aa req-af09038d-eeaf-4bbb-9564-cd849f98bbc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received unexpected event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.471 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.554 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.555 2 DEBUG nova.virt.images [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] 062d9f80-76b6-42ce-bee7-0fb82a008353 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.556 2 DEBUG nova.privsep.utils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.557 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:18 np0005466013 nova_compute[192144]: 2025-10-02 12:13:18.997 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.001 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.061 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.063 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.077 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.135 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.136 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.137 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.149 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.202 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.203 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.246 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.248 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.248 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.308 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.309 2 DEBUG nova.virt.disk.api [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Checking if we can resize image /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.310 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.372 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.374 2 DEBUG nova.virt.disk.api [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Cannot resize image /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.374 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.375 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Ensure instance console log exists: /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.375 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.376 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.376 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.378 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Start _get_guest_xml network_info=[{"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.385 2 WARNING nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.396 2 DEBUG nova.virt.libvirt.host [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.397 2 DEBUG nova.virt.libvirt.host [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.405 2 DEBUG nova.virt.libvirt.host [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.406 2 DEBUG nova.virt.libvirt.host [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.407 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.407 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.408 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.408 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.408 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.409 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.409 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.409 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.409 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.410 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.410 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.410 2 DEBUG nova.virt.hardware [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.410 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.430 2 DEBUG nova.virt.libvirt.vif [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:12:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1330483951',display_name='tempest-ServerDiskConfigTestJSON-server-1330483951',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1330483951',id=62,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-bb4if0yp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-Se
rverDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:16Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=960cdfa5-111c-4d08-82c7-29134bd55212,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.431 2 DEBUG nova.network.os_vif_util [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.432 2 DEBUG nova.network.os_vif_util [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.434 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <uuid>960cdfa5-111c-4d08-82c7-29134bd55212</uuid>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <name>instance-0000003e</name>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1330483951</nova:name>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:13:19</nova:creationTime>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:user uuid="def48c13fd6a43ba88836b753986a731">tempest-ServerDiskConfigTestJSON-1763056137-project-member</nova:user>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:project uuid="ffae703d68b24b9c89686c149113fc2b">tempest-ServerDiskConfigTestJSON-1763056137</nova:project>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="062d9f80-76b6-42ce-bee7-0fb82a008353"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        <nova:port uuid="ad0f91db-0150-4421-a472-f9984b0a20bc">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <entry name="serial">960cdfa5-111c-4d08-82c7-29134bd55212</entry>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <entry name="uuid">960cdfa5-111c-4d08-82c7-29134bd55212</entry>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:62:bc:7f"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <target dev="tapad0f91db-01"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/console.log" append="off"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:13:19 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:13:19 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:13:19 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:13:19 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.435 2 DEBUG nova.compute.manager [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Preparing to wait for external event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.435 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.435 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.435 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.436 2 DEBUG nova.virt.libvirt.vif [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:12:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1330483951',display_name='tempest-ServerDiskConfigTestJSON-server-1330483951',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1330483951',id=62,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-bb4if0yp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-Se
rverDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:16Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=960cdfa5-111c-4d08-82c7-29134bd55212,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.436 2 DEBUG nova.network.os_vif_util [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.437 2 DEBUG nova.network.os_vif_util [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.437 2 DEBUG os_vif [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.443 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad0f91db-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.444 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapad0f91db-01, col_values=(('external_ids', {'iface-id': 'ad0f91db-0150-4421-a472-f9984b0a20bc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:bc:7f', 'vm-uuid': '960cdfa5-111c-4d08-82c7-29134bd55212'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:19 np0005466013 NetworkManager[51205]: <info>  [1759407199.4469] manager: (tapad0f91db-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.454 2 INFO os_vif [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01')#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.540 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.541 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.541 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No VIF found with MAC fa:16:3e:62:bc:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.542 2 INFO nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Using config drive#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.568 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.625 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'keypairs' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:19 np0005466013 nova_compute[192144]: 2025-10-02 12:13:19.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:13:20 np0005466013 podman[228898]: 2025-10-02 12:13:20.699361338 +0000 UTC m=+0.074762125 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Oct  2 08:13:20 np0005466013 podman[228899]: 2025-10-02 12:13:20.719897816 +0000 UTC m=+0.092375502 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=)
Oct  2 08:13:20 np0005466013 nova_compute[192144]: 2025-10-02 12:13:20.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:20 np0005466013 nova_compute[192144]: 2025-10-02 12:13:20.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:13:20 np0005466013 nova_compute[192144]: 2025-10-02 12:13:20.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.017 2 INFO nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Creating config drive at /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.023 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp9xib1k4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.151 2 DEBUG oslo_concurrency.processutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp9xib1k4" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:21 np0005466013 kernel: tapad0f91db-01: entered promiscuous mode
Oct  2 08:13:21 np0005466013 NetworkManager[51205]: <info>  [1759407201.2256] manager: (tapad0f91db-01): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Oct  2 08:13:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:21Z|00197|binding|INFO|Claiming lport ad0f91db-0150-4421-a472-f9984b0a20bc for this chassis.
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:21Z|00198|binding|INFO|ad0f91db-0150-4421-a472-f9984b0a20bc: Claiming fa:16:3e:62:bc:7f 10.100.0.8
Oct  2 08:13:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:21Z|00199|binding|INFO|Setting lport ad0f91db-0150-4421-a472-f9984b0a20bc ovn-installed in OVS
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:21 np0005466013 systemd-udevd[228954]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:21 np0005466013 NetworkManager[51205]: <info>  [1759407201.2750] device (tapad0f91db-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:13:21 np0005466013 NetworkManager[51205]: <info>  [1759407201.2764] device (tapad0f91db-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:13:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:21Z|00200|binding|INFO|Setting lport ad0f91db-0150-4421-a472-f9984b0a20bc up in Southbound
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.328 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bc:7f 10.100.0.8'], port_security=['fa:16:3e:62:bc:7f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '960cdfa5-111c-4d08-82c7-29134bd55212', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ad0f91db-0150-4421-a472-f9984b0a20bc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.330 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ad0f91db-0150-4421-a472-f9984b0a20bc in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b bound to our chassis#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.332 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b#033[00m
Oct  2 08:13:21 np0005466013 systemd-machined[152202]: New machine qemu-27-instance-0000003e.
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.343 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[61840b63-a1ca-4378-ad35-d99e235793df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.345 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6de4737-c1 in ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.348 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6de4737-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.348 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cd0040f3-879b-4cb6-b601-8b65e5a50378]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.350 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b3466506-8354-42a6-a373-48c5bfccbd5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 systemd[1]: Started Virtual Machine qemu-27-instance-0000003e.
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.363 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[a8077709-13b0-41d4-b8e1-64bf6b3e420b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.383 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63fb08ba-06bc-42c2-8ca8-ac02588480c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.411 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[1d683c2b-93fc-44dd-af57-16bf6adacd20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 NetworkManager[51205]: <info>  [1759407201.4191] manager: (tapd6de4737-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/108)
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.417 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3d96ae99-a25d-4666-95aa-f2b4b51da8ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 systemd-udevd[228956]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.449 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9940d251-2ef9-4c28-beda-e610f1328fd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.454 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d83e5c2b-d3c8-4c51-84c6-3fdd5ff5656e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 NetworkManager[51205]: <info>  [1759407201.4750] device (tapd6de4737-c0): carrier: link connected
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.480 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[12f98317-7874-45f3-8b85-1ec7263151f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.497 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b6d87c-089b-466c-8d84-d0e760046098]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519309, 'reachable_time': 23190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228990, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.513 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d46ae949-e20b-4bb2-b52e-66576e5fc92d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:c91f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519309, 'tstamp': 519309}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228991, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.528 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[585a005f-7441-42cc-aa25-7eeace1413e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519309, 'reachable_time': 23190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228992, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.562 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8f500a-1d9e-4adb-8807-719cc21278f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.629 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[25ce7786-8530-4151-b49e-1d440e3eec28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.631 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.631 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.631 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6de4737-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:21 np0005466013 NetworkManager[51205]: <info>  [1759407201.6520] manager: (tapd6de4737-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Oct  2 08:13:21 np0005466013 kernel: tapd6de4737-c0: entered promiscuous mode
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.662 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6de4737-c0, col_values=(('external_ids', {'iface-id': 'cc451eb7-bf34-4b54-96d8-b834f11e06fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:21Z|00201|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.669 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.670 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3b8e53-98da-438f-aacf-d40530e0b138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.672 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:13:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:21.673 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'env', 'PROCESS_TAG=haproxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.915 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.916 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.916 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:13:21 np0005466013 nova_compute[192144]: 2025-10-02 12:13:21.916 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:22 np0005466013 podman[229027]: 2025-10-02 12:13:22.093080324 +0000 UTC m=+0.068635964 container create fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:13:22 np0005466013 systemd[1]: Started libpod-conmon-fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b.scope.
Oct  2 08:13:22 np0005466013 podman[229027]: 2025-10-02 12:13:22.060287474 +0000 UTC m=+0.035843144 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:13:22 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:13:22 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a73b4c548ac45cd6760f8d1d072c2a299ebdfc7bb575b4022c37a678d04ad9c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:13:22 np0005466013 podman[229027]: 2025-10-02 12:13:22.194975961 +0000 UTC m=+0.170531631 container init fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:13:22 np0005466013 podman[229027]: 2025-10-02 12:13:22.202803684 +0000 UTC m=+0.178359324 container start fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:13:22 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229047]: [NOTICE]   (229051) : New worker (229053) forked
Oct  2 08:13:22 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229047]: [NOTICE]   (229051) : Loading success.
Oct  2 08:13:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:22.238 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:22.276 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.575 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 960cdfa5-111c-4d08-82c7-29134bd55212 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.575 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407202.5745528, 960cdfa5-111c-4d08-82c7-29134bd55212 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.576 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] VM Started (Lifecycle Event)#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.599 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.603 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407202.5747693, 960cdfa5-111c-4d08-82c7-29134bd55212 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.603 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.622 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.627 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:22 np0005466013 nova_compute[192144]: 2025-10-02 12:13:22.652 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:13:23 np0005466013 nova_compute[192144]: 2025-10-02 12:13:23.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:24 np0005466013 nova_compute[192144]: 2025-10-02 12:13:24.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:24 np0005466013 podman[229062]: 2025-10-02 12:13:24.688081395 +0000 UTC m=+0.063938899 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:13:24 np0005466013 podman[229063]: 2025-10-02 12:13:24.691915014 +0000 UTC m=+0.065799166 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.465 2 DEBUG nova.compute.manager [req-dc399e33-49e3-4559-be47-26477b948ccb req-7003dbc0-5ebb-44d8-818f-1687c2b4baea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.466 2 DEBUG oslo_concurrency.lockutils [req-dc399e33-49e3-4559-be47-26477b948ccb req-7003dbc0-5ebb-44d8-818f-1687c2b4baea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.466 2 DEBUG oslo_concurrency.lockutils [req-dc399e33-49e3-4559-be47-26477b948ccb req-7003dbc0-5ebb-44d8-818f-1687c2b4baea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.466 2 DEBUG oslo_concurrency.lockutils [req-dc399e33-49e3-4559-be47-26477b948ccb req-7003dbc0-5ebb-44d8-818f-1687c2b4baea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.466 2 DEBUG nova.compute.manager [req-dc399e33-49e3-4559-be47-26477b948ccb req-7003dbc0-5ebb-44d8-818f-1687c2b4baea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Processing event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.467 2 DEBUG nova.compute.manager [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.470 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407205.4707072, 960cdfa5-111c-4d08-82c7-29134bd55212 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.471 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.473 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.476 2 INFO nova.virt.libvirt.driver [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance spawned successfully.#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.477 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.491 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.498 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.501 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.502 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.502 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.502 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.503 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.503 2 DEBUG nova.virt.libvirt.driver [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.526 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.648 2 DEBUG nova.compute.manager [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.827 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.827 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.828 2 DEBUG nova.objects.instance [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:13:25 np0005466013 nova_compute[192144]: 2025-10-02 12:13:25.918 2 DEBUG oslo_concurrency.lockutils [None req-f4e70715-2180-4437-bcd0-7d20e8386324 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.211 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updating instance_info_cache with network_info: [{"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.233 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.234 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.234 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.235 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.235 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.235 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.236 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.236 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.257 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.258 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.258 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.258 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.330 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.402 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.404 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.473 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.480 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.547 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.548 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.627 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.826 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.828 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5515MB free_disk=73.32492446899414GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.828 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.829 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.918 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 960cdfa5-111c-4d08-82c7-29134bd55212 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.918 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.919 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:13:26 np0005466013 nova_compute[192144]: 2025-10-02 12:13:26.919 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:13:27 np0005466013 nova_compute[192144]: 2025-10-02 12:13:27.012 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:27 np0005466013 nova_compute[192144]: 2025-10-02 12:13:27.030 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:27 np0005466013 nova_compute[192144]: 2025-10-02 12:13:27.072 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:13:27 np0005466013 nova_compute[192144]: 2025-10-02 12:13:27.073 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.280 2 DEBUG nova.compute.manager [req-ddb17826-e47c-4b00-ba5f-f8a523f27944 req-0536f37c-4610-4941-8a72-bd997b506407 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.281 2 DEBUG oslo_concurrency.lockutils [req-ddb17826-e47c-4b00-ba5f-f8a523f27944 req-0536f37c-4610-4941-8a72-bd997b506407 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.281 2 DEBUG oslo_concurrency.lockutils [req-ddb17826-e47c-4b00-ba5f-f8a523f27944 req-0536f37c-4610-4941-8a72-bd997b506407 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.281 2 DEBUG oslo_concurrency.lockutils [req-ddb17826-e47c-4b00-ba5f-f8a523f27944 req-0536f37c-4610-4941-8a72-bd997b506407 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.282 2 DEBUG nova.compute.manager [req-ddb17826-e47c-4b00-ba5f-f8a523f27944 req-0536f37c-4610-4941-8a72-bd997b506407 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] No waiting events found dispatching network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.282 2 WARNING nova.compute.manager [req-ddb17826-e47c-4b00-ba5f-f8a523f27944 req-0536f37c-4610-4941-8a72-bd997b506407 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received unexpected event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc for instance with vm_state active and task_state None.#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.493 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.494 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.494 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.495 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.495 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.507 2 INFO nova.compute.manager [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Terminating instance#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.519 2 DEBUG nova.compute.manager [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:13:28 np0005466013 kernel: tapad0f91db-01 (unregistering): left promiscuous mode
Oct  2 08:13:28 np0005466013 NetworkManager[51205]: <info>  [1759407208.5519] device (tapad0f91db-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:28Z|00202|binding|INFO|Releasing lport ad0f91db-0150-4421-a472-f9984b0a20bc from this chassis (sb_readonly=0)
Oct  2 08:13:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:28Z|00203|binding|INFO|Setting lport ad0f91db-0150-4421-a472-f9984b0a20bc down in Southbound
Oct  2 08:13:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:28Z|00204|binding|INFO|Removing iface tapad0f91db-01 ovn-installed in OVS
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.579 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bc:7f 10.100.0.8'], port_security=['fa:16:3e:62:bc:7f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '960cdfa5-111c-4d08-82c7-29134bd55212', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ad0f91db-0150-4421-a472-f9984b0a20bc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.581 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ad0f91db-0150-4421-a472-f9984b0a20bc in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.582 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.584 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5c922fc0-b26e-4728-9dde-0c3746359904]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.584 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace which is not needed anymore#033[00m
Oct  2 08:13:28 np0005466013 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Oct  2 08:13:28 np0005466013 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003e.scope: Consumed 4.181s CPU time.
Oct  2 08:13:28 np0005466013 systemd-machined[152202]: Machine qemu-27-instance-0000003e terminated.
Oct  2 08:13:28 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229047]: [NOTICE]   (229051) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:28 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229047]: [NOTICE]   (229051) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:28 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229047]: [WARNING]  (229051) : Exiting Master process...
Oct  2 08:13:28 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229047]: [ALERT]    (229051) : Current worker (229053) exited with code 143 (Terminated)
Oct  2 08:13:28 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229047]: [WARNING]  (229051) : All workers exited. Exiting... (0)
Oct  2 08:13:28 np0005466013 systemd[1]: libpod-fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b.scope: Deactivated successfully.
Oct  2 08:13:28 np0005466013 podman[229142]: 2025-10-02 12:13:28.730492008 +0000 UTC m=+0.054060670 container died fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:28 np0005466013 kernel: tapad0f91db-01: entered promiscuous mode
Oct  2 08:13:28 np0005466013 kernel: tapad0f91db-01 (unregistering): left promiscuous mode
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:28Z|00205|binding|INFO|Claiming lport ad0f91db-0150-4421-a472-f9984b0a20bc for this chassis.
Oct  2 08:13:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:28Z|00206|binding|INFO|ad0f91db-0150-4421-a472-f9984b0a20bc: Claiming fa:16:3e:62:bc:7f 10.100.0.8
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.771 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bc:7f 10.100.0.8'], port_security=['fa:16:3e:62:bc:7f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '960cdfa5-111c-4d08-82c7-29134bd55212', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ad0f91db-0150-4421-a472-f9984b0a20bc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:28Z|00207|binding|INFO|Releasing lport ad0f91db-0150-4421-a472-f9984b0a20bc from this chassis (sb_readonly=0)
Oct  2 08:13:28 np0005466013 systemd[1]: var-lib-containers-storage-overlay-8a73b4c548ac45cd6760f8d1d072c2a299ebdfc7bb575b4022c37a678d04ad9c-merged.mount: Deactivated successfully.
Oct  2 08:13:28 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.790 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bc:7f 10.100.0.8'], port_security=['fa:16:3e:62:bc:7f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '960cdfa5-111c-4d08-82c7-29134bd55212', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ad0f91db-0150-4421-a472-f9984b0a20bc) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:28 np0005466013 podman[229142]: 2025-10-02 12:13:28.793321391 +0000 UTC m=+0.116890023 container cleanup fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.801 2 INFO nova.virt.libvirt.driver [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Instance destroyed successfully.#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.802 2 DEBUG nova.objects.instance [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'resources' on Instance uuid 960cdfa5-111c-4d08-82c7-29134bd55212 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:28 np0005466013 systemd[1]: libpod-conmon-fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b.scope: Deactivated successfully.
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.819 2 DEBUG nova.virt.libvirt.vif [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:12:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1330483951',display_name='tempest-ServerDiskConfigTestJSON-server-1330483951',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1330483951',id=62,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:25Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-bb4if0yp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:13:25Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=960cdfa5-111c-4d08-82c7-29134bd55212,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.820 2 DEBUG nova.network.os_vif_util [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "ad0f91db-0150-4421-a472-f9984b0a20bc", "address": "fa:16:3e:62:bc:7f", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad0f91db-01", "ovs_interfaceid": "ad0f91db-0150-4421-a472-f9984b0a20bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.821 2 DEBUG nova.network.os_vif_util [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.821 2 DEBUG os_vif [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.823 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad0f91db-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.829 2 INFO os_vif [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bc:7f,bridge_name='br-int',has_traffic_filtering=True,id=ad0f91db-0150-4421-a472-f9984b0a20bc,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad0f91db-01')#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.830 2 INFO nova.virt.libvirt.driver [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Deleting instance files /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212_del#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.831 2 INFO nova.virt.libvirt.driver [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Deletion of /var/lib/nova/instances/960cdfa5-111c-4d08-82c7-29134bd55212_del complete#033[00m
Oct  2 08:13:28 np0005466013 podman[229189]: 2025-10-02 12:13:28.872875774 +0000 UTC m=+0.052342938 container remove fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.880 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[32a6da77-cf5f-4206-9eed-b7cd08062295]: (4, ('Thu Oct  2 12:13:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b)\nfcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b\nThu Oct  2 12:13:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (fcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b)\nfcb9dce6c2ee7b84ae7c60c6b6c739601b482b6e8cfd318c2e9ee0717382780b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.883 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd0396e-9240-4c0c-ab1f-610c8ae623dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.884 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 kernel: tapd6de4737-c0: left promiscuous mode
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.904 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d3fdc1-be3a-4365-af02-45e1b04ae346]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.926 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ec6631ff-b6f7-4adb-a974-accece77139d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.929 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2df8cd41-8284-4455-aba7-285daecd08f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.934 2 INFO nova.compute.manager [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.935 2 DEBUG oslo.service.loopingcall [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.936 2 DEBUG nova.compute.manager [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:13:28 np0005466013 nova_compute[192144]: 2025-10-02 12:13:28.936 2 DEBUG nova.network.neutron [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.953 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7fceae69-5fe4-4a75-9935-c92d7e3e54d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519302, 'reachable_time': 32271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229204, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 systemd[1]: run-netns-ovnmeta\x2dd6de4737\x2dca60\x2d4c8d\x2dbfd5\x2d687f9366ec8b.mount: Deactivated successfully.
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.958 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.958 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[e66d32f1-3d23-43a2-a974-b0ddff0b33e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.960 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ad0f91db-0150-4421-a472-f9984b0a20bc in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.961 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.962 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[03f25a0f-6b9a-4233-b298-1fb1b8522cd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.963 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ad0f91db-0150-4421-a472-f9984b0a20bc in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.965 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:28.965 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2247154f-588a-4a04-b87f-82bf383cfed1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:29 np0005466013 nova_compute[192144]: 2025-10-02 12:13:29.922 2 DEBUG nova.network.neutron [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:29 np0005466013 nova_compute[192144]: 2025-10-02 12:13:29.954 2 INFO nova.compute.manager [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Took 1.02 seconds to deallocate network for instance.#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.052 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.053 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.130 2 DEBUG nova.compute.provider_tree [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.152 2 DEBUG nova.scheduler.client.report [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.224 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.260 2 INFO nova.scheduler.client.report [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Deleted allocations for instance 960cdfa5-111c-4d08-82c7-29134bd55212#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.376 2 DEBUG oslo_concurrency.lockutils [None req-f80c7a74-6b7f-465d-a723-0314aacfe5a2 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.430 2 DEBUG nova.compute.manager [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-unplugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.431 2 DEBUG oslo_concurrency.lockutils [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.431 2 DEBUG oslo_concurrency.lockutils [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.432 2 DEBUG oslo_concurrency.lockutils [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.432 2 DEBUG nova.compute.manager [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] No waiting events found dispatching network-vif-unplugged-ad0f91db-0150-4421-a472-f9984b0a20bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.432 2 WARNING nova.compute.manager [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received unexpected event network-vif-unplugged-ad0f91db-0150-4421-a472-f9984b0a20bc for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.433 2 DEBUG nova.compute.manager [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.433 2 DEBUG oslo_concurrency.lockutils [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.433 2 DEBUG oslo_concurrency.lockutils [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.434 2 DEBUG oslo_concurrency.lockutils [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "960cdfa5-111c-4d08-82c7-29134bd55212-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.434 2 DEBUG nova.compute.manager [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] No waiting events found dispatching network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.434 2 WARNING nova.compute.manager [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received unexpected event network-vif-plugged-ad0f91db-0150-4421-a472-f9984b0a20bc for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:13:30 np0005466013 nova_compute[192144]: 2025-10-02 12:13:30.434 2 DEBUG nova.compute.manager [req-99233dab-df87-4944-a9c9-9a650b564f88 req-d0861c7b-52b9-4f1a-adb8-837b0e8b2892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Received event network-vif-deleted-ad0f91db-0150-4421-a472-f9984b0a20bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:31.279 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:31 np0005466013 nova_compute[192144]: 2025-10-02 12:13:31.795 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:31 np0005466013 nova_compute[192144]: 2025-10-02 12:13:31.796 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:31 np0005466013 nova_compute[192144]: 2025-10-02 12:13:31.829 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:13:31 np0005466013 nova_compute[192144]: 2025-10-02 12:13:31.946 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:31 np0005466013 nova_compute[192144]: 2025-10-02 12:13:31.947 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:31 np0005466013 nova_compute[192144]: 2025-10-02 12:13:31.955 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:13:31 np0005466013 nova_compute[192144]: 2025-10-02 12:13:31.955 2 INFO nova.compute.claims [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.127 2 DEBUG nova.compute.provider_tree [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.186 2 DEBUG nova.scheduler.client.report [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.227 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.228 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.335 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.336 2 DEBUG nova.network.neutron [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.372 2 INFO nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.425 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.634 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.636 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.637 2 INFO nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Creating image(s)#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.637 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.638 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.638 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.652 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.720 2 DEBUG nova.policy [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'def48c13fd6a43ba88836b753986a731', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffae703d68b24b9c89686c149113fc2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.727 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.728 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.729 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.745 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.806 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.807 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.847 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.849 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.849 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.916 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.917 2 DEBUG nova.virt.disk.api [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Checking if we can resize image /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.917 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.979 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.980 2 DEBUG nova.virt.disk.api [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Cannot resize image /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:13:32 np0005466013 nova_compute[192144]: 2025-10-02 12:13:32.982 2 DEBUG nova.objects.instance [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.000 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.000 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Ensure instance console log exists: /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.001 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.001 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.002 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:33 np0005466013 nova_compute[192144]: 2025-10-02 12:13:33.832 2 DEBUG nova.network.neutron [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Successfully created port: 4e4dadeb-b5ea-4021-9b54-652b42f7783b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:13:34 np0005466013 nova_compute[192144]: 2025-10-02 12:13:34.068 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:35.601 103434 DEBUG eventlet.wsgi.server [-] (103434) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:35.604 103434 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: Accept: */*#015
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: Connection: close#015
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: Content-Type: text/plain#015
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: Host: 169.254.169.254#015
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: User-Agent: curl/7.84.0#015
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: X-Forwarded-For: 10.100.0.8#015
Oct  2 08:13:35 np0005466013 ovn_metadata_agent[103318]: X-Ovn-Network-Id: d8c71e68-a016-4099-877d-881b9a6e634c __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  2 08:13:36 np0005466013 nova_compute[192144]: 2025-10-02 12:13:36.070 2 DEBUG nova.network.neutron [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Successfully updated port: 4e4dadeb-b5ea-4021-9b54-652b42f7783b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:13:36 np0005466013 nova_compute[192144]: 2025-10-02 12:13:36.117 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-878c8333-7358-4047-ad93-5b34ec0b2643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:36 np0005466013 nova_compute[192144]: 2025-10-02 12:13:36.117 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-878c8333-7358-4047-ad93-5b34ec0b2643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:36 np0005466013 nova_compute[192144]: 2025-10-02 12:13:36.118 2 DEBUG nova.network.neutron [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:13:36 np0005466013 nova_compute[192144]: 2025-10-02 12:13:36.693 2 DEBUG nova.network.neutron [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.147 2 DEBUG nova.compute.manager [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-changed-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.148 2 DEBUG nova.compute.manager [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Refreshing instance network info cache due to event network-changed-4e4dadeb-b5ea-4021-9b54-652b42f7783b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.148 2 DEBUG oslo_concurrency.lockutils [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-878c8333-7358-4047-ad93-5b34ec0b2643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:37.288 103434 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  2 08:13:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:37.289 103434 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1671 time: 1.6851544#033[00m
Oct  2 08:13:37 np0005466013 haproxy-metadata-proxy-d8c71e68-a016-4099-877d-881b9a6e634c[228650]: 10.100.0.8:43736 [02/Oct/2025:12:13:35.600] listener listener/metadata 0/0/0/1689/1689 200 1655 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Oct  2 08:13:37 np0005466013 podman[229221]: 2025-10-02 12:13:37.697232316 +0000 UTC m=+0.064643380 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  2 08:13:37 np0005466013 podman[229220]: 2025-10-02 12:13:37.697495115 +0000 UTC m=+0.063553857 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:13:37 np0005466013 podman[229222]: 2025-10-02 12:13:37.736116864 +0000 UTC m=+0.098620816 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.880 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.880 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.880 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.881 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.881 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:37 np0005466013 nova_compute[192144]: 2025-10-02 12:13:37.935 2 INFO nova.compute.manager [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Terminating instance#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.035 2 DEBUG nova.compute.manager [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:13:38 np0005466013 kernel: tapaf4c0ef6-86 (unregistering): left promiscuous mode
Oct  2 08:13:38 np0005466013 NetworkManager[51205]: <info>  [1759407218.0649] device (tapaf4c0ef6-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:38Z|00208|binding|INFO|Releasing lport af4c0ef6-8643-4a83-be1e-4000ab5fd894 from this chassis (sb_readonly=0)
Oct  2 08:13:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:38Z|00209|binding|INFO|Setting lport af4c0ef6-8643-4a83-be1e-4000ab5fd894 down in Southbound
Oct  2 08:13:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:38Z|00210|binding|INFO|Removing iface tapaf4c0ef6-86 ovn-installed in OVS
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.115 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:c4:a7 10.100.0.8'], port_security=['fa:16:3e:a9:c4:a7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8c71e68-a016-4099-877d-881b9a6e634c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7edffac1-7798-43a3-9bf2-487473d2826c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d8e7264-2c9c-43b0-92cb-5dba96a13a6a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=af4c0ef6-8643-4a83-be1e-4000ab5fd894) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.117 103323 INFO neutron.agent.ovn.metadata.agent [-] Port af4c0ef6-8643-4a83-be1e-4000ab5fd894 in datapath d8c71e68-a016-4099-877d-881b9a6e634c unbound from our chassis#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.119 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8c71e68-a016-4099-877d-881b9a6e634c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.119 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7c16a4d7-4fe7-4ab2-a5b5-236d150572c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.120 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c namespace which is not needed anymore#033[00m
Oct  2 08:13:38 np0005466013 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Oct  2 08:13:38 np0005466013 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Consumed 14.689s CPU time.
Oct  2 08:13:38 np0005466013 systemd-machined[152202]: Machine qemu-26-instance-0000003f terminated.
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.233 2 DEBUG nova.network.neutron [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Updating instance_info_cache with network_info: [{"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:38 np0005466013 kernel: tapaf4c0ef6-86: entered promiscuous mode
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.259 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-878c8333-7358-4047-ad93-5b34ec0b2643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:38 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [NOTICE]   (228648) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:38 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [NOTICE]   (228648) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:38 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [WARNING]  (228648) : Exiting Master process...
Oct  2 08:13:38 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [WARNING]  (228648) : Exiting Master process...
Oct  2 08:13:38 np0005466013 NetworkManager[51205]: <info>  [1759407218.2604] manager: (tapaf4c0ef6-86): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.260 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance network_info: |[{"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:13:38 np0005466013 systemd-udevd[229289]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.262 2 DEBUG oslo_concurrency.lockutils [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-878c8333-7358-4047-ad93-5b34ec0b2643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.262 2 DEBUG nova.network.neutron [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Refreshing network info cache for port 4e4dadeb-b5ea-4021-9b54-652b42f7783b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:13:38 np0005466013 kernel: tapaf4c0ef6-86 (unregistering): left promiscuous mode
Oct  2 08:13:38 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [ALERT]    (228648) : Current worker (228650) exited with code 143 (Terminated)
Oct  2 08:13:38 np0005466013 neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c[228644]: [WARNING]  (228648) : All workers exited. Exiting... (0)
Oct  2 08:13:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:38Z|00211|binding|INFO|Claiming lport af4c0ef6-8643-4a83-be1e-4000ab5fd894 for this chassis.
Oct  2 08:13:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:38Z|00212|binding|INFO|af4c0ef6-8643-4a83-be1e-4000ab5fd894: Claiming fa:16:3e:a9:c4:a7 10.100.0.8
Oct  2 08:13:38 np0005466013 systemd[1]: libpod-6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc.scope: Deactivated successfully.
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.268 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Start _get_guest_xml network_info=[{"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 conmon[228644]: conmon 6fe455975ed6ed4833a3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc.scope/container/memory.events
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.278 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:c4:a7 10.100.0.8'], port_security=['fa:16:3e:a9:c4:a7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8c71e68-a016-4099-877d-881b9a6e634c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7edffac1-7798-43a3-9bf2-487473d2826c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d8e7264-2c9c-43b0-92cb-5dba96a13a6a, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=af4c0ef6-8643-4a83-be1e-4000ab5fd894) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:38 np0005466013 podman[229310]: 2025-10-02 12:13:38.279297187 +0000 UTC m=+0.061897655 container died 6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.279 2 WARNING nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:38Z|00213|binding|INFO|Releasing lport af4c0ef6-8643-4a83-be1e-4000ab5fd894 from this chassis (sb_readonly=0)
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.298 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:c4:a7 10.100.0.8'], port_security=['fa:16:3e:a9:c4:a7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8c71e68-a016-4099-877d-881b9a6e634c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7cc67bd4da7644d3bd8155cc7f188aa4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7edffac1-7798-43a3-9bf2-487473d2826c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d8e7264-2c9c-43b0-92cb-5dba96a13a6a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=af4c0ef6-8643-4a83-be1e-4000ab5fd894) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.304 2 DEBUG nova.virt.libvirt.host [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.305 2 DEBUG nova.virt.libvirt.host [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:13:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay-b08d1fed084485a971404a936ac2c0139cf9f533a0c3be14a90d793d6eb78604-merged.mount: Deactivated successfully.
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.323 2 DEBUG nova.virt.libvirt.host [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.324 2 DEBUG nova.virt.libvirt.host [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.326 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.326 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.327 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.328 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.328 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.328 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.329 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.329 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.329 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.329 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.330 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.330 2 DEBUG nova.virt.hardware [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:13:38 np0005466013 podman[229310]: 2025-10-02 12:13:38.330909391 +0000 UTC m=+0.113509859 container cleanup 6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.333 2 DEBUG nova.virt.libvirt.vif [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-216461092',display_name='tempest-ServerDiskConfigTestJSON-server-216461092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-216461092',id=68,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-q7hhnz17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfi
gTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:32Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=878c8333-7358-4047-ad93-5b34ec0b2643,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.334 2 DEBUG nova.network.os_vif_util [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.335 2 DEBUG nova.network.os_vif_util [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.336 2 DEBUG nova.objects.instance [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:38 np0005466013 systemd[1]: libpod-conmon-6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc.scope: Deactivated successfully.
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.341 2 INFO nova.virt.libvirt.driver [-] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Instance destroyed successfully.#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.342 2 DEBUG nova.objects.instance [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lazy-loading 'resources' on Instance uuid f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.360 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <uuid>878c8333-7358-4047-ad93-5b34ec0b2643</uuid>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <name>instance-00000044</name>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-216461092</nova:name>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:13:38</nova:creationTime>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:user uuid="def48c13fd6a43ba88836b753986a731">tempest-ServerDiskConfigTestJSON-1763056137-project-member</nova:user>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:project uuid="ffae703d68b24b9c89686c149113fc2b">tempest-ServerDiskConfigTestJSON-1763056137</nova:project>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        <nova:port uuid="4e4dadeb-b5ea-4021-9b54-652b42f7783b">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <entry name="serial">878c8333-7358-4047-ad93-5b34ec0b2643</entry>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <entry name="uuid">878c8333-7358-4047-ad93-5b34ec0b2643</entry>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:8c:d3:48"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <target dev="tap4e4dadeb-b5"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/console.log" append="off"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:13:38 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:13:38 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:13:38 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:13:38 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.362 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Preparing to wait for external event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.363 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.363 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.363 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.364 2 DEBUG nova.virt.libvirt.vif [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-216461092',display_name='tempest-ServerDiskConfigTestJSON-server-216461092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-216461092',id=68,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-q7hhnz17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-Serve
rDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:32Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=878c8333-7358-4047-ad93-5b34ec0b2643,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.364 2 DEBUG nova.network.os_vif_util [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.365 2 DEBUG nova.network.os_vif_util [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.365 2 DEBUG os_vif [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.366 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.367 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.369 2 DEBUG nova.virt.libvirt.vif [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=63,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOPn0xpGhVjQWIudBz7gv1qPDp8tKuxdf0JEzn8tPqDQfHIB9ebNoAyxeWg2Ca9UuDQGIerOpqmy+4eH92L/OuNtU0gu0/6j03K3Wlz31kXoWZ7qiagw3/7mChywZRPIBg==',key_name='tempest-keypair-644342312',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7cc67bd4da7644d3bd8155cc7f188aa4',ramdisk_id='',reservation_id='r-40n50i1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-347469803',owner_user_name='tempest-ServersV294TestFqdnHostnames-347469803-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:13:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a212b40f430d496d94ca57954f39afd6',uuid=f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.370 2 DEBUG nova.network.os_vif_util [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Converting VIF {"id": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "address": "fa:16:3e:a9:c4:a7", "network": {"id": "d8c71e68-a016-4099-877d-881b9a6e634c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691956717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7cc67bd4da7644d3bd8155cc7f188aa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf4c0ef6-86", "ovs_interfaceid": "af4c0ef6-8643-4a83-be1e-4000ab5fd894", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.370 2 DEBUG nova.network.os_vif_util [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:c4:a7,bridge_name='br-int',has_traffic_filtering=True,id=af4c0ef6-8643-4a83-be1e-4000ab5fd894,network=Network(d8c71e68-a016-4099-877d-881b9a6e634c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf4c0ef6-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.371 2 DEBUG os_vif [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:c4:a7,bridge_name='br-int',has_traffic_filtering=True,id=af4c0ef6-8643-4a83-be1e-4000ab5fd894,network=Network(d8c71e68-a016-4099-877d-881b9a6e634c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf4c0ef6-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf4c0ef6-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.378 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4e4dadeb-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.379 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4e4dadeb-b5, col_values=(('external_ids', {'iface-id': '4e4dadeb-b5ea-4021-9b54-652b42f7783b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:d3:48', 'vm-uuid': '878c8333-7358-4047-ad93-5b34ec0b2643'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.380 2 INFO os_vif [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:c4:a7,bridge_name='br-int',has_traffic_filtering=True,id=af4c0ef6-8643-4a83-be1e-4000ab5fd894,network=Network(d8c71e68-a016-4099-877d-881b9a6e634c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf4c0ef6-86')#033[00m
Oct  2 08:13:38 np0005466013 NetworkManager[51205]: <info>  [1759407218.3810] manager: (tap4e4dadeb-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.380 2 INFO nova.virt.libvirt.driver [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Deleting instance files /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b_del#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.381 2 INFO nova.virt.libvirt.driver [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Deletion of /var/lib/nova/instances/f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b_del complete#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.390 2 INFO os_vif [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5')#033[00m
Oct  2 08:13:38 np0005466013 podman[229348]: 2025-10-02 12:13:38.412673611 +0000 UTC m=+0.057028373 container remove 6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.419 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c55039-4d0f-4339-81ac-418f12dfa44e]: (4, ('Thu Oct  2 12:13:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c (6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc)\n6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc\nThu Oct  2 12:13:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c (6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc)\n6fe455975ed6ed4833a3a8b5f5b4bb5d8ab0dcb5a17a47ad40588f42873cfbfc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.421 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d0dd5c50-f24b-4c5d-b1b4-a4eed9c298d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.423 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8c71e68-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 kernel: tapd8c71e68-a0: left promiscuous mode
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.445 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7b79bad6-ddae-45dd-986b-e31d8279c336]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.473 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e1f65b-36a0-4110-8de6-ced9840bab96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.474 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[05579aa3-78a9-4acf-96ac-29d2f3f18b63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.482 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.482 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.483 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No VIF found with MAC fa:16:3e:8c:d3:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.483 2 INFO nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Using config drive#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.492 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e354da66-fd09-4908-89fc-ea4d0e8eb60a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517193, 'reachable_time': 33490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229369, 'error': None, 'target': 'ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.494 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8c71e68-a016-4099-877d-881b9a6e634c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:38 np0005466013 systemd[1]: run-netns-ovnmeta\x2dd8c71e68\x2da016\x2d4099\x2d877d\x2d881b9a6e634c.mount: Deactivated successfully.
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.494 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbdb1a5-940d-4dcc-aa23-561168ba0dea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.496 103323 INFO neutron.agent.ovn.metadata.agent [-] Port af4c0ef6-8643-4a83-be1e-4000ab5fd894 in datapath d8c71e68-a016-4099-877d-881b9a6e634c unbound from our chassis#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.497 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8c71e68-a016-4099-877d-881b9a6e634c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.498 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7de95d52-4f23-4ce5-927a-b9ad1e27a359]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.498 103323 INFO neutron.agent.ovn.metadata.agent [-] Port af4c0ef6-8643-4a83-be1e-4000ab5fd894 in datapath d8c71e68-a016-4099-877d-881b9a6e634c unbound from our chassis#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.499 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8c71e68-a016-4099-877d-881b9a6e634c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:38.500 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[99fe2f73-bb3c-427b-97bd-34155c9831ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.612 2 DEBUG nova.compute.manager [req-246359a4-8d99-46c5-aa0d-e60b2fa3f1ae req-2d277322-5d22-47d0-b375-19625a28af17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-vif-unplugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.612 2 DEBUG oslo_concurrency.lockutils [req-246359a4-8d99-46c5-aa0d-e60b2fa3f1ae req-2d277322-5d22-47d0-b375-19625a28af17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.612 2 DEBUG oslo_concurrency.lockutils [req-246359a4-8d99-46c5-aa0d-e60b2fa3f1ae req-2d277322-5d22-47d0-b375-19625a28af17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.613 2 DEBUG oslo_concurrency.lockutils [req-246359a4-8d99-46c5-aa0d-e60b2fa3f1ae req-2d277322-5d22-47d0-b375-19625a28af17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.613 2 DEBUG nova.compute.manager [req-246359a4-8d99-46c5-aa0d-e60b2fa3f1ae req-2d277322-5d22-47d0-b375-19625a28af17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] No waiting events found dispatching network-vif-unplugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.613 2 DEBUG nova.compute.manager [req-246359a4-8d99-46c5-aa0d-e60b2fa3f1ae req-2d277322-5d22-47d0-b375-19625a28af17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-vif-unplugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.623 2 INFO nova.compute.manager [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.624 2 DEBUG oslo.service.loopingcall [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.624 2 DEBUG nova.compute.manager [-] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:13:38 np0005466013 nova_compute[192144]: 2025-10-02 12:13:38.624 2 DEBUG nova.network.neutron [-] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.277 2 INFO nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Creating config drive at /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config#033[00m
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.282 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxzm73egu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.413 2 DEBUG oslo_concurrency.processutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxzm73egu" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:39 np0005466013 kernel: tap4e4dadeb-b5: entered promiscuous mode
Oct  2 08:13:39 np0005466013 NetworkManager[51205]: <info>  [1759407219.5023] manager: (tap4e4dadeb-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Oct  2 08:13:39 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:39Z|00214|binding|INFO|Claiming lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b for this chassis.
Oct  2 08:13:39 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:39Z|00215|binding|INFO|4e4dadeb-b5ea-4021-9b54-652b42f7783b: Claiming fa:16:3e:8c:d3:48 10.100.0.11
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.512 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:d3:48 10.100.0.11'], port_security=['fa:16:3e:8c:d3:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '878c8333-7358-4047-ad93-5b34ec0b2643', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4e4dadeb-b5ea-4021-9b54-652b42f7783b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:39 np0005466013 NetworkManager[51205]: <info>  [1759407219.5134] device (tap4e4dadeb-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:13:39 np0005466013 NetworkManager[51205]: <info>  [1759407219.5140] device (tap4e4dadeb-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.514 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4e4dadeb-b5ea-4021-9b54-652b42f7783b in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b bound to our chassis#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.515 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b#033[00m
Oct  2 08:13:39 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:39Z|00216|binding|INFO|Setting lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b ovn-installed in OVS
Oct  2 08:13:39 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:39Z|00217|binding|INFO|Setting lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b up in Southbound
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.531 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2b8f0be5-0f51-469f-9863-81a5ff713f49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.532 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6de4737-c1 in ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.534 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6de4737-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.534 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[97d48440-2860-4fca-a0ea-6991150d5c49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.535 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c2862557-64fb-4f43-935d-6c2cdae823c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.548 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d6f153-4d7a-4db1-bcab-daa6dc18468a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 systemd-machined[152202]: New machine qemu-28-instance-00000044.
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.565 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c23e00-6624-4a01-aab6-28b263f9bf3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 systemd[1]: Started Virtual Machine qemu-28-instance-00000044.
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.591 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[6b95d689-9edc-4df3-96a6-7e04129a2094]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 NetworkManager[51205]: <info>  [1759407219.5987] manager: (tapd6de4737-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.598 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[191e643c-d47f-46a6-81e6-e5c7ac9232c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.634 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d5130a12-503a-49e4-a8a6-4ffb9290009f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.638 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[44bea671-8a5c-4fb4-a183-7c2e687e76c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 NetworkManager[51205]: <info>  [1759407219.6624] device (tapd6de4737-c0): carrier: link connected
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.670 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[65129bb8-4b54-4dc6-8c26-a3604417892f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.691 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a1704753-401e-4241-b9ab-a655fceec467]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521127, 'reachable_time': 33930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229419, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.711 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee91f26-f4a9-4d4e-89db-3ea19046f4a6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:c91f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521127, 'tstamp': 521127}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229420, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.728 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[01808f21-0adb-4c7a-8d3b-0aa2b85ade95]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 196, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 196, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521127, 'reachable_time': 33930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229421, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.764 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2ae04c-3850-46cd-80d0-3b52facec16b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.832 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5692b7de-c889-4eb5-9923-9484bbf0ff21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.834 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.834 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.835 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6de4737-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:39 np0005466013 kernel: tapd6de4737-c0: entered promiscuous mode
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005466013 NetworkManager[51205]: <info>  [1759407219.8379] manager: (tapd6de4737-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.843 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6de4737-c0, col_values=(('external_ids', {'iface-id': 'cc451eb7-bf34-4b54-96d8-b834f11e06fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:39Z|00218|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.848 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.860 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[79493974-d8a6-4f75-9e1a-0c2bc1ef0e2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:39 np0005466013 nova_compute[192144]: 2025-10-02 12:13:39.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.863 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:13:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:39.865 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'env', 'PROCESS_TAG=haproxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.006 2 DEBUG nova.compute.manager [req-0cd19eca-8d0d-409e-ba81-518d8c8abd72 req-949ba9a2-89ed-433b-be1f-8021e1e35868 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.007 2 DEBUG oslo_concurrency.lockutils [req-0cd19eca-8d0d-409e-ba81-518d8c8abd72 req-949ba9a2-89ed-433b-be1f-8021e1e35868 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.008 2 DEBUG oslo_concurrency.lockutils [req-0cd19eca-8d0d-409e-ba81-518d8c8abd72 req-949ba9a2-89ed-433b-be1f-8021e1e35868 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.008 2 DEBUG oslo_concurrency.lockutils [req-0cd19eca-8d0d-409e-ba81-518d8c8abd72 req-949ba9a2-89ed-433b-be1f-8021e1e35868 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.008 2 DEBUG nova.compute.manager [req-0cd19eca-8d0d-409e-ba81-518d8c8abd72 req-949ba9a2-89ed-433b-be1f-8021e1e35868 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Processing event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:13:40 np0005466013 podman[229460]: 2025-10-02 12:13:40.254963808 +0000 UTC m=+0.052425100 container create 7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:13:40 np0005466013 systemd[1]: Started libpod-conmon-7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8.scope.
Oct  2 08:13:40 np0005466013 podman[229460]: 2025-10-02 12:13:40.225805502 +0000 UTC m=+0.023266814 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:13:40 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:13:40 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54604c6c63e6832291b2f351036bd744c8e027244ee9e305ace7909db1e92d1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:13:40 np0005466013 podman[229460]: 2025-10-02 12:13:40.354787941 +0000 UTC m=+0.152249293 container init 7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:13:40 np0005466013 podman[229460]: 2025-10-02 12:13:40.360980974 +0000 UTC m=+0.158442286 container start 7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:13:40 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [NOTICE]   (229480) : New worker (229482) forked
Oct  2 08:13:40 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [NOTICE]   (229480) : Loading success.
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.509 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.513 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407220.5065682, 878c8333-7358-4047-ad93-5b34ec0b2643 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.514 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] VM Started (Lifecycle Event)#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.518 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.525 2 INFO nova.virt.libvirt.driver [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance spawned successfully.#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.526 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.550 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.557 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.560 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.561 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.561 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.562 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.562 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.562 2 DEBUG nova.virt.libvirt.driver [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.620 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.620 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407220.5067873, 878c8333-7358-4047-ad93-5b34ec0b2643 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.621 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.658 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.664 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407220.5171728, 878c8333-7358-4047-ad93-5b34ec0b2643 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.664 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.755 2 INFO nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Took 8.12 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.756 2 DEBUG nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.768 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.774 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.823 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.882 2 DEBUG nova.network.neutron [-] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.886 2 DEBUG nova.compute.manager [req-0e6059e4-f0c0-43dd-b186-914a3352ab72 req-e0ca1cfc-5b2f-4d1e-a6b1-57c2cdd7d218 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.886 2 DEBUG oslo_concurrency.lockutils [req-0e6059e4-f0c0-43dd-b186-914a3352ab72 req-e0ca1cfc-5b2f-4d1e-a6b1-57c2cdd7d218 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.886 2 DEBUG oslo_concurrency.lockutils [req-0e6059e4-f0c0-43dd-b186-914a3352ab72 req-e0ca1cfc-5b2f-4d1e-a6b1-57c2cdd7d218 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.887 2 DEBUG oslo_concurrency.lockutils [req-0e6059e4-f0c0-43dd-b186-914a3352ab72 req-e0ca1cfc-5b2f-4d1e-a6b1-57c2cdd7d218 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.887 2 DEBUG nova.compute.manager [req-0e6059e4-f0c0-43dd-b186-914a3352ab72 req-e0ca1cfc-5b2f-4d1e-a6b1-57c2cdd7d218 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] No waiting events found dispatching network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.887 2 WARNING nova.compute.manager [req-0e6059e4-f0c0-43dd-b186-914a3352ab72 req-e0ca1cfc-5b2f-4d1e-a6b1-57c2cdd7d218 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received unexpected event network-vif-plugged-af4c0ef6-8643-4a83-be1e-4000ab5fd894 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.902 2 INFO nova.compute.manager [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Took 8.99 seconds to build instance.#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.907 2 INFO nova.compute.manager [-] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Took 2.28 seconds to deallocate network for instance.#033[00m
Oct  2 08:13:40 np0005466013 nova_compute[192144]: 2025-10-02 12:13:40.926 2 DEBUG oslo_concurrency.lockutils [None req-b8272ec5-9b2c-458c-b143-ba1feb15dbed def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.010 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.011 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.129 2 DEBUG nova.compute.provider_tree [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.152 2 DEBUG nova.scheduler.client.report [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.181 2 DEBUG nova.network.neutron [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Updated VIF entry in instance network info cache for port 4e4dadeb-b5ea-4021-9b54-652b42f7783b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.182 2 DEBUG nova.network.neutron [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Updating instance_info_cache with network_info: [{"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.185 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.211 2 DEBUG oslo_concurrency.lockutils [req-18c5e090-c331-4b50-956c-0251c2bef9a2 req-2578734a-16c1-441f-bb94-660053d924a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-878c8333-7358-4047-ad93-5b34ec0b2643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.221 2 INFO nova.scheduler.client.report [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Deleted allocations for instance f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.316 2 DEBUG oslo_concurrency.lockutils [None req-9da7fa0c-0c9e-4c65-9279-0572121e78a1 a212b40f430d496d94ca57954f39afd6 7cc67bd4da7644d3bd8155cc7f188aa4 - - default default] Lock "f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:41 np0005466013 nova_compute[192144]: 2025-10-02 12:13:41.475 2 DEBUG nova.compute.manager [req-b36b2ee2-098d-4905-80f0-b8c0e8dbcf6f req-ece3d887-f6b3-43a9-8a7c-6f84909a38c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Received event network-vif-deleted-af4c0ef6-8643-4a83-be1e-4000ab5fd894 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:42 np0005466013 nova_compute[192144]: 2025-10-02 12:13:42.121 2 DEBUG nova.compute.manager [req-af580ae4-531d-4d5e-9f89-ed1a2f93dea2 req-536dac6f-743e-4f1e-98e8-96034cf2b8f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:42 np0005466013 nova_compute[192144]: 2025-10-02 12:13:42.123 2 DEBUG oslo_concurrency.lockutils [req-af580ae4-531d-4d5e-9f89-ed1a2f93dea2 req-536dac6f-743e-4f1e-98e8-96034cf2b8f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:42 np0005466013 nova_compute[192144]: 2025-10-02 12:13:42.123 2 DEBUG oslo_concurrency.lockutils [req-af580ae4-531d-4d5e-9f89-ed1a2f93dea2 req-536dac6f-743e-4f1e-98e8-96034cf2b8f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:42 np0005466013 nova_compute[192144]: 2025-10-02 12:13:42.124 2 DEBUG oslo_concurrency.lockutils [req-af580ae4-531d-4d5e-9f89-ed1a2f93dea2 req-536dac6f-743e-4f1e-98e8-96034cf2b8f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:42 np0005466013 nova_compute[192144]: 2025-10-02 12:13:42.124 2 DEBUG nova.compute.manager [req-af580ae4-531d-4d5e-9f89-ed1a2f93dea2 req-536dac6f-743e-4f1e-98e8-96034cf2b8f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] No waiting events found dispatching network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:42 np0005466013 nova_compute[192144]: 2025-10-02 12:13:42.124 2 WARNING nova.compute.manager [req-af580ae4-531d-4d5e-9f89-ed1a2f93dea2 req-536dac6f-743e-4f1e-98e8-96034cf2b8f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received unexpected event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:13:43 np0005466013 nova_compute[192144]: 2025-10-02 12:13:43.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:43 np0005466013 nova_compute[192144]: 2025-10-02 12:13:43.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:43Z|00219|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:13:43 np0005466013 nova_compute[192144]: 2025-10-02 12:13:43.800 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407208.7990265, 960cdfa5-111c-4d08-82c7-29134bd55212 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:43 np0005466013 nova_compute[192144]: 2025-10-02 12:13:43.802 2 INFO nova.compute.manager [-] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:13:43 np0005466013 nova_compute[192144]: 2025-10-02 12:13:43.859 2 DEBUG nova.compute.manager [None req-4cabcb0b-f6ac-4698-aed9-4724124b5f56 - - - - - -] [instance: 960cdfa5-111c-4d08-82c7-29134bd55212] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:43 np0005466013 nova_compute[192144]: 2025-10-02 12:13:43.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:43Z|00220|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:13:43 np0005466013 nova_compute[192144]: 2025-10-02 12:13:43.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:46 np0005466013 nova_compute[192144]: 2025-10-02 12:13:46.134 2 INFO nova.compute.manager [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Rebuilding instance#033[00m
Oct  2 08:13:46 np0005466013 nova_compute[192144]: 2025-10-02 12:13:46.858 2 DEBUG nova.compute.manager [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:46 np0005466013 nova_compute[192144]: 2025-10-02 12:13:46.941 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_requests' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:46 np0005466013 nova_compute[192144]: 2025-10-02 12:13:46.962 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:46 np0005466013 nova_compute[192144]: 2025-10-02 12:13:46.976 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'resources' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:46 np0005466013 nova_compute[192144]: 2025-10-02 12:13:46.993 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:47 np0005466013 nova_compute[192144]: 2025-10-02 12:13:47.005 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:13:47 np0005466013 nova_compute[192144]: 2025-10-02 12:13:47.009 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:13:48 np0005466013 nova_compute[192144]: 2025-10-02 12:13:48.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:48 np0005466013 nova_compute[192144]: 2025-10-02 12:13:48.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:48 np0005466013 podman[229492]: 2025-10-02 12:13:48.682001593 +0000 UTC m=+0.054856066 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:51 np0005466013 podman[229513]: 2025-10-02 12:13:51.697799157 +0000 UTC m=+0.068567730 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:13:51 np0005466013 podman[229512]: 2025-10-02 12:13:51.711002267 +0000 UTC m=+0.084351691 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS)
Oct  2 08:13:53 np0005466013 nova_compute[192144]: 2025-10-02 12:13:53.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:53 np0005466013 nova_compute[192144]: 2025-10-02 12:13:53.338 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407218.3277574, f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:53 np0005466013 nova_compute[192144]: 2025-10-02 12:13:53.339 2 INFO nova.compute.manager [-] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:13:53 np0005466013 nova_compute[192144]: 2025-10-02 12:13:53.375 2 DEBUG nova.compute.manager [None req-8e612fbf-ca06-4c84-a6de-d8c3fc453ccf - - - - - -] [instance: f20e0a1c-377f-4c93-ad70-cdc5e3b1fa7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:53 np0005466013 nova_compute[192144]: 2025-10-02 12:13:53.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:53Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8c:d3:48 10.100.0.11
Oct  2 08:13:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:53Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:d3:48 10.100.0.11
Oct  2 08:13:55 np0005466013 podman[229565]: 2025-10-02 12:13:55.68335546 +0000 UTC m=+0.056183936 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:13:55 np0005466013 podman[229566]: 2025-10-02 12:13:55.698041177 +0000 UTC m=+0.065260155 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:13:57 np0005466013 nova_compute[192144]: 2025-10-02 12:13:57.054 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:13:58 np0005466013 nova_compute[192144]: 2025-10-02 12:13:58.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:58 np0005466013 nova_compute[192144]: 2025-10-02 12:13:58.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:59 np0005466013 kernel: tap4e4dadeb-b5 (unregistering): left promiscuous mode
Oct  2 08:13:59 np0005466013 NetworkManager[51205]: <info>  [1759407239.2027] device (tap4e4dadeb-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:59Z|00221|binding|INFO|Releasing lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b from this chassis (sb_readonly=0)
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:59Z|00222|binding|INFO|Setting lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b down in Southbound
Oct  2 08:13:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:13:59Z|00223|binding|INFO|Removing iface tap4e4dadeb-b5 ovn-installed in OVS
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.220 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:d3:48 10.100.0.11'], port_security=['fa:16:3e:8c:d3:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '878c8333-7358-4047-ad93-5b34ec0b2643', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4e4dadeb-b5ea-4021-9b54-652b42f7783b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.221 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4e4dadeb-b5ea-4021-9b54-652b42f7783b in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.223 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.226 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf9b673-8bbf-4e71-8aee-03831ac55dc4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.227 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace which is not needed anymore#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:59 np0005466013 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000044.scope: Deactivated successfully.
Oct  2 08:13:59 np0005466013 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000044.scope: Consumed 13.617s CPU time.
Oct  2 08:13:59 np0005466013 systemd-machined[152202]: Machine qemu-28-instance-00000044 terminated.
Oct  2 08:13:59 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [NOTICE]   (229480) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:59 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [NOTICE]   (229480) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:59 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [WARNING]  (229480) : Exiting Master process...
Oct  2 08:13:59 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [WARNING]  (229480) : Exiting Master process...
Oct  2 08:13:59 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [ALERT]    (229480) : Current worker (229482) exited with code 143 (Terminated)
Oct  2 08:13:59 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229476]: [WARNING]  (229480) : All workers exited. Exiting... (0)
Oct  2 08:13:59 np0005466013 systemd[1]: libpod-7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8.scope: Deactivated successfully.
Oct  2 08:13:59 np0005466013 podman[229631]: 2025-10-02 12:13:59.376264924 +0000 UTC m=+0.046170879 container died 7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:13:59 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:59 np0005466013 systemd[1]: var-lib-containers-storage-overlay-54604c6c63e6832291b2f351036bd744c8e027244ee9e305ace7909db1e92d1b-merged.mount: Deactivated successfully.
Oct  2 08:13:59 np0005466013 podman[229631]: 2025-10-02 12:13:59.408466796 +0000 UTC m=+0.078372751 container cleanup 7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:13:59 np0005466013 systemd[1]: libpod-conmon-7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8.scope: Deactivated successfully.
Oct  2 08:13:59 np0005466013 podman[229659]: 2025-10-02 12:13:59.476055074 +0000 UTC m=+0.045518057 container remove 7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.485 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[43d1d70b-b918-4c08-84a1-bde477ab5aa2]: (4, ('Thu Oct  2 12:13:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8)\n7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8\nThu Oct  2 12:13:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8)\n7f1ac7d676fbb8864907092b165d3eddf1399e5f6ad1a1ff63fe133c7cc144b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.489 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd7d9f0-28a3-4aa8-b874-3c7c19fd4efa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.491 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:59 np0005466013 kernel: tapd6de4737-c0: left promiscuous mode
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.513 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b20f974a-c239-47ef-b2d9-d8cdcfe6a734]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.550 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9813d093-c782-44ce-b18e-5b6ce5bd3373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.552 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e13a8eb9-2f43-4fdb-8c85-70d328e7100a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.571 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6a798b90-7890-4243-88b3-693167633e80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521120, 'reachable_time': 24772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229695, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 systemd[1]: run-netns-ovnmeta\x2dd6de4737\x2dca60\x2d4c8d\x2dbfd5\x2d687f9366ec8b.mount: Deactivated successfully.
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.576 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:13:59.576 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a28f94-2131-4e6a-aeb8-fa1d63d3911d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.713 2 DEBUG nova.compute.manager [req-20755fd4-73fe-4d80-afef-bab605c62dbc req-2f18d255-18d0-4307-91c0-f83f46033af5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-unplugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.714 2 DEBUG oslo_concurrency.lockutils [req-20755fd4-73fe-4d80-afef-bab605c62dbc req-2f18d255-18d0-4307-91c0-f83f46033af5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.714 2 DEBUG oslo_concurrency.lockutils [req-20755fd4-73fe-4d80-afef-bab605c62dbc req-2f18d255-18d0-4307-91c0-f83f46033af5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.714 2 DEBUG oslo_concurrency.lockutils [req-20755fd4-73fe-4d80-afef-bab605c62dbc req-2f18d255-18d0-4307-91c0-f83f46033af5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.715 2 DEBUG nova.compute.manager [req-20755fd4-73fe-4d80-afef-bab605c62dbc req-2f18d255-18d0-4307-91c0-f83f46033af5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] No waiting events found dispatching network-vif-unplugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:59 np0005466013 nova_compute[192144]: 2025-10-02 12:13:59.715 2 WARNING nova.compute.manager [req-20755fd4-73fe-4d80-afef-bab605c62dbc req-2f18d255-18d0-4307-91c0-f83f46033af5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received unexpected event network-vif-unplugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.068 2 INFO nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.075 2 INFO nova.virt.libvirt.driver [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance destroyed successfully.#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.081 2 INFO nova.virt.libvirt.driver [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance destroyed successfully.#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.081 2 DEBUG nova.virt.libvirt.vif [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:13:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-216461092',display_name='tempest-ServerDiskConfigTestJSON-server-216461092',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-216461092',id=68,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-q7hhnz17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member
'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:45Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=878c8333-7358-4047-ad93-5b34ec0b2643,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.082 2 DEBUG nova.network.os_vif_util [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.082 2 DEBUG nova.network.os_vif_util [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.083 2 DEBUG os_vif [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4e4dadeb-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.089 2 INFO os_vif [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5')#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.090 2 INFO nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Deleting instance files /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643_del#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.091 2 INFO nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Deletion of /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643_del complete#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.501 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.501 2 INFO nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Creating image(s)#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.502 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.502 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.503 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.516 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.586 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.587 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.588 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.603 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.665 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.667 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.702 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.703 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.704 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.766 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.768 2 DEBUG nova.virt.disk.api [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Checking if we can resize image /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.768 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.853 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.854 2 DEBUG nova.virt.disk.api [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Cannot resize image /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.855 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.855 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Ensure instance console log exists: /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.855 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.856 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.856 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.858 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Start _get_guest_xml network_info=[{"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.862 2 WARNING nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.872 2 DEBUG nova.virt.libvirt.host [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.873 2 DEBUG nova.virt.libvirt.host [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.879 2 DEBUG nova.virt.libvirt.host [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.880 2 DEBUG nova.virt.libvirt.host [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.881 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.881 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.882 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.882 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.882 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.883 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.883 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.883 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.883 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.883 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.884 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.884 2 DEBUG nova.virt.hardware [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.884 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.914 2 DEBUG nova.virt.libvirt.vif [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:13:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-216461092',display_name='tempest-ServerDiskConfigTestJSON-server-216461092',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-216461092',id=68,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-q7hhnz17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-Serv
erDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:00Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=878c8333-7358-4047-ad93-5b34ec0b2643,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.915 2 DEBUG nova.network.os_vif_util [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.915 2 DEBUG nova.network.os_vif_util [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.918 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <uuid>878c8333-7358-4047-ad93-5b34ec0b2643</uuid>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <name>instance-00000044</name>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-216461092</nova:name>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:14:00</nova:creationTime>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:user uuid="def48c13fd6a43ba88836b753986a731">tempest-ServerDiskConfigTestJSON-1763056137-project-member</nova:user>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:project uuid="ffae703d68b24b9c89686c149113fc2b">tempest-ServerDiskConfigTestJSON-1763056137</nova:project>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="062d9f80-76b6-42ce-bee7-0fb82a008353"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        <nova:port uuid="4e4dadeb-b5ea-4021-9b54-652b42f7783b">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <entry name="serial">878c8333-7358-4047-ad93-5b34ec0b2643</entry>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <entry name="uuid">878c8333-7358-4047-ad93-5b34ec0b2643</entry>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:8c:d3:48"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <target dev="tap4e4dadeb-b5"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/console.log" append="off"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:14:00 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:14:00 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:14:00 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:14:00 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.919 2 DEBUG nova.compute.manager [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Preparing to wait for external event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.919 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.920 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.920 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.921 2 DEBUG nova.virt.libvirt.vif [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:13:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-216461092',display_name='tempest-ServerDiskConfigTestJSON-server-216461092',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-216461092',id=68,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-q7hhnz17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-Serv
erDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:00Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=878c8333-7358-4047-ad93-5b34ec0b2643,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.921 2 DEBUG nova.network.os_vif_util [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.921 2 DEBUG nova.network.os_vif_util [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.922 2 DEBUG os_vif [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.923 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.924 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.926 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4e4dadeb-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.927 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4e4dadeb-b5, col_values=(('external_ids', {'iface-id': '4e4dadeb-b5ea-4021-9b54-652b42f7783b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:d3:48', 'vm-uuid': '878c8333-7358-4047-ad93-5b34ec0b2643'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466013 NetworkManager[51205]: <info>  [1759407240.9298] manager: (tap4e4dadeb-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466013 nova_compute[192144]: 2025-10-02 12:14:00.936 2 INFO os_vif [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5')#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.024 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.025 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.025 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No VIF found with MAC fa:16:3e:8c:d3:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.026 2 INFO nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Using config drive#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.054 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.093 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'keypairs' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.632 2 INFO nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Creating config drive at /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.637 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvoudb4kz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.849 2 DEBUG nova.compute.manager [req-2645cc2a-cc35-49d3-840f-6a7cc3ab21a0 req-914cf9d5-5f7e-4052-8fa1-60820b7dfa5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.851 2 DEBUG oslo_concurrency.lockutils [req-2645cc2a-cc35-49d3-840f-6a7cc3ab21a0 req-914cf9d5-5f7e-4052-8fa1-60820b7dfa5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.852 2 DEBUG oslo_concurrency.lockutils [req-2645cc2a-cc35-49d3-840f-6a7cc3ab21a0 req-914cf9d5-5f7e-4052-8fa1-60820b7dfa5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.852 2 DEBUG oslo_concurrency.lockutils [req-2645cc2a-cc35-49d3-840f-6a7cc3ab21a0 req-914cf9d5-5f7e-4052-8fa1-60820b7dfa5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:01 np0005466013 nova_compute[192144]: 2025-10-02 12:14:01.852 2 DEBUG nova.compute.manager [req-2645cc2a-cc35-49d3-840f-6a7cc3ab21a0 req-914cf9d5-5f7e-4052-8fa1-60820b7dfa5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Processing event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.296 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.296 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.296 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:02 np0005466013 nova_compute[192144]: 2025-10-02 12:14:02.365 2 DEBUG oslo_concurrency.processutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvoudb4kz" returned: 0 in 0.729s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:02 np0005466013 kernel: tap4e4dadeb-b5: entered promiscuous mode
Oct  2 08:14:02 np0005466013 nova_compute[192144]: 2025-10-02 12:14:02.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:02Z|00224|binding|INFO|Claiming lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b for this chassis.
Oct  2 08:14:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:02Z|00225|binding|INFO|4e4dadeb-b5ea-4021-9b54-652b42f7783b: Claiming fa:16:3e:8c:d3:48 10.100.0.11
Oct  2 08:14:02 np0005466013 NetworkManager[51205]: <info>  [1759407242.4245] manager: (tap4e4dadeb-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Oct  2 08:14:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:02Z|00226|binding|INFO|Setting lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b ovn-installed in OVS
Oct  2 08:14:02 np0005466013 nova_compute[192144]: 2025-10-02 12:14:02.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:02Z|00227|binding|INFO|Setting lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b up in Southbound
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.441 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:d3:48 10.100.0.11'], port_security=['fa:16:3e:8c:d3:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '878c8333-7358-4047-ad93-5b34ec0b2643', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4e4dadeb-b5ea-4021-9b54-652b42f7783b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:02 np0005466013 nova_compute[192144]: 2025-10-02 12:14:02.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.443 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4e4dadeb-b5ea-4021-9b54-652b42f7783b in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b bound to our chassis#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.445 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b#033[00m
Oct  2 08:14:02 np0005466013 systemd-udevd[229733]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.458 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[672219e7-5c61-4387-96e5-6d5b3fd71dfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.460 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6de4737-c1 in ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:14:02 np0005466013 systemd-machined[152202]: New machine qemu-29-instance-00000044.
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.462 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6de4737-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.462 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1894e110-dc7b-4628-952d-b8264a9c5cde]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.463 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4630204e-06ed-457a-893d-caf32d2b24aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 NetworkManager[51205]: <info>  [1759407242.4715] device (tap4e4dadeb-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:02 np0005466013 NetworkManager[51205]: <info>  [1759407242.4726] device (tap4e4dadeb-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:02 np0005466013 systemd[1]: Started Virtual Machine qemu-29-instance-00000044.
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.476 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7e0f46-2b7a-4ed7-84cc-6e980d65cb83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.491 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[591bb698-93c0-4137-a02f-c9b94b544339]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.522 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e6db4a6c-39c7-4e7f-b577-f7aaa524bf5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 NetworkManager[51205]: <info>  [1759407242.5299] manager: (tapd6de4737-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/117)
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.531 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f40c6605-e942-4a17-b61a-e704843394e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.565 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[04ac0dd1-456d-4cca-9ef5-aad63fbf79a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.569 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c609a579-8d4a-4aaf-901e-f97b0b424f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 NetworkManager[51205]: <info>  [1759407242.5908] device (tapd6de4737-c0): carrier: link connected
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.596 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0593c0e3-ad75-4ce2-8384-cf3050805fa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.614 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8db07b58-59c3-421d-82d1-ec8041c5bb4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523420, 'reachable_time': 16382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229765, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.627 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[67a091d4-7bc4-436e-af92-b9d8bf498d26]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:c91f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523420, 'tstamp': 523420}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229766, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.640 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[55cb1f08-7e0a-403a-a3b4-da6668930eb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523420, 'reachable_time': 16382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229767, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.667 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd686fd-a527-427a-9892-8639c7b09fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.726 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[38e94c63-707e-4272-a606-1760c6b7fd22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.727 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.727 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.728 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6de4737-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:02 np0005466013 nova_compute[192144]: 2025-10-02 12:14:02.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:02 np0005466013 NetworkManager[51205]: <info>  [1759407242.7309] manager: (tapd6de4737-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Oct  2 08:14:02 np0005466013 kernel: tapd6de4737-c0: entered promiscuous mode
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.735 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6de4737-c0, col_values=(('external_ids', {'iface-id': 'cc451eb7-bf34-4b54-96d8-b834f11e06fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:02Z|00228|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:14:02 np0005466013 nova_compute[192144]: 2025-10-02 12:14:02.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.739 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.740 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0f58f96d-805a-4cb2-80bd-ee7c4070182b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.741 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:14:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:02.742 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'env', 'PROCESS_TAG=haproxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:14:02 np0005466013 nova_compute[192144]: 2025-10-02 12:14:02.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:03 np0005466013 podman[229805]: 2025-10-02 12:14:03.136922659 +0000 UTC m=+0.054356137 container create fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:14:03 np0005466013 systemd[1]: Started libpod-conmon-fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02.scope.
Oct  2 08:14:03 np0005466013 podman[229805]: 2025-10-02 12:14:03.108316481 +0000 UTC m=+0.025749979 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:14:03 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:14:03 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/732aca603c1a17a8b031cf03602c590eba5121e86a446526450c44213a73a0d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:14:03 np0005466013 podman[229805]: 2025-10-02 12:14:03.221320051 +0000 UTC m=+0.138753549 container init fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:14:03 np0005466013 podman[229805]: 2025-10-02 12:14:03.22722288 +0000 UTC m=+0.144656358 container start fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.233 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 878c8333-7358-4047-ad93-5b34ec0b2643 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.233 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407243.2326896, 878c8333-7358-4047-ad93-5b34ec0b2643 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.234 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] VM Started (Lifecycle Event)#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.236 2 DEBUG nova.compute.manager [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.241 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.245 2 INFO nova.virt.libvirt.driver [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance spawned successfully.#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.246 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:14:03 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229820]: [NOTICE]   (229824) : New worker (229826) forked
Oct  2 08:14:03 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229820]: [NOTICE]   (229824) : Loading success.
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.262 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.266 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.287 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.288 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.288 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.289 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.289 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.290 2 DEBUG nova.virt.libvirt.driver [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.316 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.316 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407243.2337282, 878c8333-7358-4047-ad93-5b34ec0b2643 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.317 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.358 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.362 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407243.2380078, 878c8333-7358-4047-ad93-5b34ec0b2643 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.362 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.403 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.409 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.443 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.667 2 DEBUG nova.compute.manager [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.911 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.911 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.911 2 DEBUG nova.objects.instance [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.938 2 DEBUG nova.compute.manager [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.939 2 DEBUG oslo_concurrency.lockutils [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.939 2 DEBUG oslo_concurrency.lockutils [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.940 2 DEBUG oslo_concurrency.lockutils [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.940 2 DEBUG nova.compute.manager [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] No waiting events found dispatching network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.940 2 WARNING nova.compute.manager [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received unexpected event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.940 2 DEBUG nova.compute.manager [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.940 2 DEBUG oslo_concurrency.lockutils [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.941 2 DEBUG oslo_concurrency.lockutils [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.941 2 DEBUG oslo_concurrency.lockutils [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.941 2 DEBUG nova.compute.manager [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] No waiting events found dispatching network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:03 np0005466013 nova_compute[192144]: 2025-10-02 12:14:03.941 2 WARNING nova.compute.manager [req-ed5bf4c5-51d1-41e6-b8dc-2f4fccc6f375 req-ccbc7090-cd83-48b7-9c14-c3281d0aa61c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received unexpected event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:14:04 np0005466013 nova_compute[192144]: 2025-10-02 12:14:04.034 2 DEBUG oslo_concurrency.lockutils [None req-ec04e757-19b9-4178-9194-870bc2575bc1 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.910 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.911 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.912 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.912 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.913 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.928 2 INFO nova.compute.manager [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Terminating instance#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.940 2 DEBUG nova.compute.manager [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:14:05 np0005466013 kernel: tap4e4dadeb-b5 (unregistering): left promiscuous mode
Oct  2 08:14:05 np0005466013 NetworkManager[51205]: <info>  [1759407245.9624] device (tap4e4dadeb-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:05Z|00229|binding|INFO|Releasing lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b from this chassis (sb_readonly=0)
Oct  2 08:14:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:05Z|00230|binding|INFO|Setting lport 4e4dadeb-b5ea-4021-9b54-652b42f7783b down in Southbound
Oct  2 08:14:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:05Z|00231|binding|INFO|Removing iface tap4e4dadeb-b5 ovn-installed in OVS
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:05.984 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:d3:48 10.100.0.11'], port_security=['fa:16:3e:8c:d3:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '878c8333-7358-4047-ad93-5b34ec0b2643', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4e4dadeb-b5ea-4021-9b54-652b42f7783b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:05.985 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4e4dadeb-b5ea-4021-9b54-652b42f7783b in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:14:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:05.987 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:14:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:05.988 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6d1f5dd8-b0ce-44be-ae30-d9788bd87892]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:05.989 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace which is not needed anymore#033[00m
Oct  2 08:14:05 np0005466013 nova_compute[192144]: 2025-10-02 12:14:05.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:06 np0005466013 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000044.scope: Deactivated successfully.
Oct  2 08:14:06 np0005466013 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000044.scope: Consumed 3.441s CPU time.
Oct  2 08:14:06 np0005466013 systemd-machined[152202]: Machine qemu-29-instance-00000044 terminated.
Oct  2 08:14:06 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229820]: [NOTICE]   (229824) : haproxy version is 2.8.14-c23fe91
Oct  2 08:14:06 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229820]: [NOTICE]   (229824) : path to executable is /usr/sbin/haproxy
Oct  2 08:14:06 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229820]: [ALERT]    (229824) : Current worker (229826) exited with code 143 (Terminated)
Oct  2 08:14:06 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229820]: [WARNING]  (229824) : All workers exited. Exiting... (0)
Oct  2 08:14:06 np0005466013 systemd[1]: libpod-fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02.scope: Deactivated successfully.
Oct  2 08:14:06 np0005466013 podman[229859]: 2025-10-02 12:14:06.140036125 +0000 UTC m=+0.048831123 container died fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:14:06 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02-userdata-shm.mount: Deactivated successfully.
Oct  2 08:14:06 np0005466013 systemd[1]: var-lib-containers-storage-overlay-732aca603c1a17a8b031cf03602c590eba5121e86a446526450c44213a73a0d1-merged.mount: Deactivated successfully.
Oct  2 08:14:06 np0005466013 podman[229859]: 2025-10-02 12:14:06.18047562 +0000 UTC m=+0.089270618 container cleanup fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:14:06 np0005466013 systemd[1]: libpod-conmon-fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02.scope: Deactivated successfully.
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.210 2 INFO nova.virt.libvirt.driver [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Instance destroyed successfully.#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.211 2 DEBUG nova.objects.instance [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'resources' on Instance uuid 878c8333-7358-4047-ad93-5b34ec0b2643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.230 2 DEBUG nova.virt.libvirt.vif [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:13:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-216461092',display_name='tempest-ServerDiskConfigTestJSON-server-216461092',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-216461092',id=68,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-q7hhnz17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:03Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=878c8333-7358-4047-ad93-5b34ec0b2643,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.231 2 DEBUG nova.network.os_vif_util [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "address": "fa:16:3e:8c:d3:48", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e4dadeb-b5", "ovs_interfaceid": "4e4dadeb-b5ea-4021-9b54-652b42f7783b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.231 2 DEBUG nova.network.os_vif_util [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.232 2 DEBUG os_vif [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4e4dadeb-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.244 2 INFO os_vif [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:d3:48,bridge_name='br-int',has_traffic_filtering=True,id=4e4dadeb-b5ea-4021-9b54-652b42f7783b,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e4dadeb-b5')#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.245 2 INFO nova.virt.libvirt.driver [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Deleting instance files /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643_del#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.246 2 INFO nova.virt.libvirt.driver [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Deletion of /var/lib/nova/instances/878c8333-7358-4047-ad93-5b34ec0b2643_del complete#033[00m
Oct  2 08:14:06 np0005466013 podman[229904]: 2025-10-02 12:14:06.259980606 +0000 UTC m=+0.047568492 container remove fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.266 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[00d9837d-a653-4f85-929c-84180405ef15]: (4, ('Thu Oct  2 12:14:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02)\nfca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02\nThu Oct  2 12:14:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (fca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02)\nfca084727a81d868fedf3ab1642f5d0b82f04eaa4b87775018c1c9153f66bd02\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.269 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f49c7cb-e3ac-40bf-af8d-4807fd6b8c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.270 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:06 np0005466013 kernel: tapd6de4737-c0: left promiscuous mode
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.288 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[21f8b4bc-8641-4732-a4b1-ebc22cc66066]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.322 2 DEBUG nova.compute.manager [req-9ae7b37a-bdcb-427b-842f-4583e5e834f6 req-2b163343-8e5a-4ced-bb95-b0f41ac4665b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-unplugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.322 2 DEBUG oslo_concurrency.lockutils [req-9ae7b37a-bdcb-427b-842f-4583e5e834f6 req-2b163343-8e5a-4ced-bb95-b0f41ac4665b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.322 2 DEBUG oslo_concurrency.lockutils [req-9ae7b37a-bdcb-427b-842f-4583e5e834f6 req-2b163343-8e5a-4ced-bb95-b0f41ac4665b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.323 2 DEBUG oslo_concurrency.lockutils [req-9ae7b37a-bdcb-427b-842f-4583e5e834f6 req-2b163343-8e5a-4ced-bb95-b0f41ac4665b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.323 2 DEBUG nova.compute.manager [req-9ae7b37a-bdcb-427b-842f-4583e5e834f6 req-2b163343-8e5a-4ced-bb95-b0f41ac4665b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] No waiting events found dispatching network-vif-unplugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.323 2 DEBUG nova.compute.manager [req-9ae7b37a-bdcb-427b-842f-4583e5e834f6 req-2b163343-8e5a-4ced-bb95-b0f41ac4665b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-unplugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.323 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c78d4a43-2497-4317-a239-6bb5cd072b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.325 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d10e77-b29a-4008-ba90-9b78fa554e40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.340 2 INFO nova.compute.manager [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.341 2 DEBUG oslo.service.loopingcall [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.341 2 DEBUG nova.compute.manager [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:14:06 np0005466013 nova_compute[192144]: 2025-10-02 12:14:06.341 2 DEBUG nova.network.neutron [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.344 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4379d06b-6c6d-47d1-83d8-bf51076daefd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523413, 'reachable_time': 29866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229919, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.346 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:14:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:06.346 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[84059553-57ef-4bba-9d27-a334b931950b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:06 np0005466013 systemd[1]: run-netns-ovnmeta\x2dd6de4737\x2dca60\x2d4c8d\x2dbfd5\x2d687f9366ec8b.mount: Deactivated successfully.
Oct  2 08:14:07 np0005466013 nova_compute[192144]: 2025-10-02 12:14:07.963 2 DEBUG nova.network.neutron [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:07 np0005466013 nova_compute[192144]: 2025-10-02 12:14:07.999 2 INFO nova.compute.manager [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Took 1.66 seconds to deallocate network for instance.#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.122 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.123 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.158 2 DEBUG nova.compute.manager [req-42b3c07e-6dd6-46f9-950a-b6c1656a4105 req-aef4b388-f006-489d-98be-cf811932b33a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-deleted-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.212 2 DEBUG nova.compute.provider_tree [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.228 2 DEBUG nova.scheduler.client.report [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.269 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.304 2 INFO nova.scheduler.client.report [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Deleted allocations for instance 878c8333-7358-4047-ad93-5b34ec0b2643#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.445 2 DEBUG oslo_concurrency.lockutils [None req-831beeae-4ed4-4c73-94e2-3765eccdf774 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.496 2 DEBUG nova.compute.manager [req-ab227e32-9dab-4ca5-a435-b243687413e4 req-003b52c6-c9bf-4e2f-9815-07479e518eda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.497 2 DEBUG oslo_concurrency.lockutils [req-ab227e32-9dab-4ca5-a435-b243687413e4 req-003b52c6-c9bf-4e2f-9815-07479e518eda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.498 2 DEBUG oslo_concurrency.lockutils [req-ab227e32-9dab-4ca5-a435-b243687413e4 req-003b52c6-c9bf-4e2f-9815-07479e518eda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.498 2 DEBUG oslo_concurrency.lockutils [req-ab227e32-9dab-4ca5-a435-b243687413e4 req-003b52c6-c9bf-4e2f-9815-07479e518eda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "878c8333-7358-4047-ad93-5b34ec0b2643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.498 2 DEBUG nova.compute.manager [req-ab227e32-9dab-4ca5-a435-b243687413e4 req-003b52c6-c9bf-4e2f-9815-07479e518eda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] No waiting events found dispatching network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:08 np0005466013 nova_compute[192144]: 2025-10-02 12:14:08.498 2 WARNING nova.compute.manager [req-ab227e32-9dab-4ca5-a435-b243687413e4 req-003b52c6-c9bf-4e2f-9815-07479e518eda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Received unexpected event network-vif-plugged-4e4dadeb-b5ea-4021-9b54-652b42f7783b for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:14:08 np0005466013 podman[229921]: 2025-10-02 12:14:08.700183715 +0000 UTC m=+0.064908233 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:14:08 np0005466013 podman[229920]: 2025-10-02 12:14:08.717005689 +0000 UTC m=+0.086671665 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:14:08 np0005466013 podman[229922]: 2025-10-02 12:14:08.757520467 +0000 UTC m=+0.122176544 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:14:11 np0005466013 nova_compute[192144]: 2025-10-02 12:14:11.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:12 np0005466013 nova_compute[192144]: 2025-10-02 12:14:12.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:12 np0005466013 nova_compute[192144]: 2025-10-02 12:14:12.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:14:13 np0005466013 nova_compute[192144]: 2025-10-02 12:14:13.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:16 np0005466013 nova_compute[192144]: 2025-10-02 12:14:16.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:18 np0005466013 nova_compute[192144]: 2025-10-02 12:14:18.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.158 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.158 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.198 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.375 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.376 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.381 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.381 2 INFO nova.compute.claims [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.559 2 DEBUG nova.compute.provider_tree [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.581 2 DEBUG nova.scheduler.client.report [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.620 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.622 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:14:19 np0005466013 podman[229989]: 2025-10-02 12:14:19.715793201 +0000 UTC m=+0.093377988 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.744 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.744 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.775 2 INFO nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:14:19 np0005466013 nova_compute[192144]: 2025-10-02 12:14:19.812 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.013 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.014 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.082 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.084 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.084 2 INFO nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Creating image(s)#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.085 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "/var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.085 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "/var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.085 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "/var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.099 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.138 2 DEBUG nova.policy [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '64ab4561f89846cc90cf0ab7f878cbd3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '11be1361f6f44b10a6efea8fccf616aa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.158 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.159 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.159 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.171 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.235 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.236 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.274 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.276 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.277 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.353 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.355 2 DEBUG nova.virt.disk.api [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Checking if we can resize image /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.355 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.421 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.422 2 DEBUG nova.virt.disk.api [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Cannot resize image /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.423 2 DEBUG nova.objects.instance [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lazy-loading 'migration_context' on Instance uuid 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.438 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.438 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Ensure instance console log exists: /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.439 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.439 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:20 np0005466013 nova_compute[192144]: 2025-10-02 12:14:20.440 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:21 np0005466013 nova_compute[192144]: 2025-10-02 12:14:21.208 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407246.2074764, 878c8333-7358-4047-ad93-5b34ec0b2643 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:21 np0005466013 nova_compute[192144]: 2025-10-02 12:14:21.209 2 INFO nova.compute.manager [-] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:14:21 np0005466013 nova_compute[192144]: 2025-10-02 12:14:21.227 2 DEBUG nova.compute.manager [None req-5b219dcf-4138-4812-8b18-b00bf48d36a4 - - - - - -] [instance: 878c8333-7358-4047-ad93-5b34ec0b2643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:21 np0005466013 nova_compute[192144]: 2025-10-02 12:14:21.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:21 np0005466013 nova_compute[192144]: 2025-10-02 12:14:21.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:21 np0005466013 nova_compute[192144]: 2025-10-02 12:14:21.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:21 np0005466013 nova_compute[192144]: 2025-10-02 12:14:21.997 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Successfully created port: efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.024 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.025 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.025 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.025 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:14:22 np0005466013 podman[230025]: 2025-10-02 12:14:22.139514145 +0000 UTC m=+0.063245980 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:14:22 np0005466013 podman[230026]: 2025-10-02 12:14:22.148036396 +0000 UTC m=+0.065821562 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.201 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.202 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5712MB free_disk=73.3545150756836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.202 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.203 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.308 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.308 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.309 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.396 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.427 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.466 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.466 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:22.824 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:22.826 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:14:22 np0005466013 nova_compute[192144]: 2025-10-02 12:14:22.928 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Successfully created port: e204a8e9-0d37-4412-bc4a-98054b03a065 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.465 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.466 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.506 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.507 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:23 np0005466013 nova_compute[192144]: 2025-10-02 12:14:23.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:24 np0005466013 nova_compute[192144]: 2025-10-02 12:14:24.097 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Successfully created port: 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.231 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Successfully updated port: efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.322 2 DEBUG nova.compute.manager [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-changed-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.323 2 DEBUG nova.compute.manager [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Refreshing instance network info cache due to event network-changed-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.323 2 DEBUG oslo_concurrency.lockutils [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.323 2 DEBUG oslo_concurrency.lockutils [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.323 2 DEBUG nova.network.neutron [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Refreshing network info cache for port efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.362 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.397 2 WARNING nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.397 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Triggering sync for uuid 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.398 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.498 2 DEBUG nova.network.neutron [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.892 2 DEBUG nova.network.neutron [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:25 np0005466013 nova_compute[192144]: 2025-10-02 12:14:25.917 2 DEBUG oslo_concurrency.lockutils [req-3951fcf7-603b-4e1d-8009-2117e8b5f82c req-7ec63886-621e-4e1b-a228-e010e9231145 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:26 np0005466013 nova_compute[192144]: 2025-10-02 12:14:26.024 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:26 np0005466013 nova_compute[192144]: 2025-10-02 12:14:26.157 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Successfully updated port: e204a8e9-0d37-4412-bc4a-98054b03a065 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:14:26 np0005466013 nova_compute[192144]: 2025-10-02 12:14:26.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:26 np0005466013 podman[230066]: 2025-10-02 12:14:26.678438851 +0000 UTC m=+0.045086353 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:14:26 np0005466013 podman[230067]: 2025-10-02 12:14:26.703760416 +0000 UTC m=+0.061432193 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.327 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Successfully updated port: 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.348 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.348 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquired lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.348 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.409 2 DEBUG nova.compute.manager [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-changed-e204a8e9-0d37-4412-bc4a-98054b03a065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.409 2 DEBUG nova.compute.manager [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Refreshing instance network info cache due to event network-changed-e204a8e9-0d37-4412-bc4a-98054b03a065. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.410 2 DEBUG oslo_concurrency.lockutils [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:27 np0005466013 nova_compute[192144]: 2025-10-02 12:14:27.502 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:14:28 np0005466013 nova_compute[192144]: 2025-10-02 12:14:28.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:29 np0005466013 nova_compute[192144]: 2025-10-02 12:14:29.006 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.449 2 DEBUG nova.network.neutron [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updating instance_info_cache with network_info: [{"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.472 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Releasing lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.472 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Instance network_info: |[{"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.473 2 DEBUG oslo_concurrency.lockutils [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.473 2 DEBUG nova.network.neutron [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Refreshing network info cache for port e204a8e9-0d37-4412-bc4a-98054b03a065 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.477 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Start _get_guest_xml network_info=[{"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.480 2 WARNING nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.492 2 DEBUG nova.virt.libvirt.host [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.493 2 DEBUG nova.virt.libvirt.host [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.505 2 DEBUG nova.virt.libvirt.host [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.506 2 DEBUG nova.virt.libvirt.host [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.507 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.507 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.508 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.508 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.509 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.509 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.509 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.509 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.510 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.510 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.510 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.511 2 DEBUG nova.virt.hardware [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.514 2 DEBUG nova.virt.libvirt.vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-1305956602-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:19Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.514 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.515 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:69:5e,bridge_name='br-int',has_traffic_filtering=True,id=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefd2e9f9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.516 2 DEBUG nova.virt.libvirt.vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-1305956602-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:19Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.516 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.516 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:1d,bridge_name='br-int',has_traffic_filtering=True,id=e204a8e9-0d37-4412-bc4a-98054b03a065,network=Network(02195939-8d5e-4e0a-8f89-f16a0db2353e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape204a8e9-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.517 2 DEBUG nova.virt.libvirt.vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-1305956602-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:19Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.517 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.518 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:70:84,bridge_name='br-int',has_traffic_filtering=True,id=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a3d251d-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.519 2 DEBUG nova.objects.instance [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lazy-loading 'pci_devices' on Instance uuid 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.535 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <uuid>38d330be-301c-4f63-9b1e-7ebc2a61f3e9</uuid>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <name>instance-00000049</name>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServersTestMultiNic-server-162137885</nova:name>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:14:30</nova:creationTime>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:user uuid="64ab4561f89846cc90cf0ab7f878cbd3">tempest-ServersTestMultiNic-1305956602-project-member</nova:user>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:project uuid="11be1361f6f44b10a6efea8fccf616aa">tempest-ServersTestMultiNic-1305956602</nova:project>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:port uuid="efd2e9f9-37be-44f8-ae2d-57277d8d6aa1">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.238" ipVersion="4"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:port uuid="e204a8e9-0d37-4412-bc4a-98054b03a065">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.1.172" ipVersion="4"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        <nova:port uuid="9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.126" ipVersion="4"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <entry name="serial">38d330be-301c-4f63-9b1e-7ebc2a61f3e9</entry>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <entry name="uuid">38d330be-301c-4f63-9b1e-7ebc2a61f3e9</entry>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk.config"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:c7:69:5e"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <target dev="tapefd2e9f9-37"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:be:4b:1d"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <target dev="tape204a8e9-0d"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:50:70:84"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <target dev="tap9a3d251d-22"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/console.log" append="off"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:14:30 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:14:30 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:14:30 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:14:30 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.536 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Preparing to wait for external event network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.537 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.537 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.537 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.537 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Preparing to wait for external event network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.538 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.538 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.538 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.538 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Preparing to wait for external event network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.539 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.539 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.539 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.540 2 DEBUG nova.virt.libvirt.vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-13059
56602-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:19Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.540 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.541 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:69:5e,bridge_name='br-int',has_traffic_filtering=True,id=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefd2e9f9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.543 2 DEBUG os_vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:69:5e,bridge_name='br-int',has_traffic_filtering=True,id=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefd2e9f9-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.544 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.545 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefd2e9f9-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.549 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapefd2e9f9-37, col_values=(('external_ids', {'iface-id': 'efd2e9f9-37be-44f8-ae2d-57277d8d6aa1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:69:5e', 'vm-uuid': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 NetworkManager[51205]: <info>  [1759407270.5515] manager: (tapefd2e9f9-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.559 2 INFO os_vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:69:5e,bridge_name='br-int',has_traffic_filtering=True,id=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefd2e9f9-37')#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.560 2 DEBUG nova.virt.libvirt.vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-13059
56602-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:19Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.560 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.561 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:1d,bridge_name='br-int',has_traffic_filtering=True,id=e204a8e9-0d37-4412-bc4a-98054b03a065,network=Network(02195939-8d5e-4e0a-8f89-f16a0db2353e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape204a8e9-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.561 2 DEBUG os_vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:1d,bridge_name='br-int',has_traffic_filtering=True,id=e204a8e9-0d37-4412-bc4a-98054b03a065,network=Network(02195939-8d5e-4e0a-8f89-f16a0db2353e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape204a8e9-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.561 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.562 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.566 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape204a8e9-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.567 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape204a8e9-0d, col_values=(('external_ids', {'iface-id': 'e204a8e9-0d37-4412-bc4a-98054b03a065', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:4b:1d', 'vm-uuid': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 NetworkManager[51205]: <info>  [1759407270.5690] manager: (tape204a8e9-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.577 2 INFO os_vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:1d,bridge_name='br-int',has_traffic_filtering=True,id=e204a8e9-0d37-4412-bc4a-98054b03a065,network=Network(02195939-8d5e-4e0a-8f89-f16a0db2353e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape204a8e9-0d')#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.578 2 DEBUG nova.virt.libvirt.vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-13059
56602-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:19Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.579 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.579 2 DEBUG nova.network.os_vif_util [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:70:84,bridge_name='br-int',has_traffic_filtering=True,id=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a3d251d-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.580 2 DEBUG os_vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:70:84,bridge_name='br-int',has_traffic_filtering=True,id=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a3d251d-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.580 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.583 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a3d251d-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.584 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9a3d251d-22, col_values=(('external_ids', {'iface-id': '9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:70:84', 'vm-uuid': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 NetworkManager[51205]: <info>  [1759407270.5856] manager: (tap9a3d251d-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.598 2 INFO os_vif [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:70:84,bridge_name='br-int',has_traffic_filtering=True,id=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a3d251d-22')#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.669 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.669 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.670 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] No VIF found with MAC fa:16:3e:c7:69:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.670 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] No VIF found with MAC fa:16:3e:be:4b:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.670 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] No VIF found with MAC fa:16:3e:50:70:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.671 2 INFO nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Using config drive#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:30 np0005466013 nova_compute[192144]: 2025-10-02 12:14:30.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.018 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.201 2 INFO nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Creating config drive at /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk.config#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.208 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5zol7n_g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.338 2 DEBUG oslo_concurrency.processutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5zol7n_g" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4149] manager: (tapefd2e9f9-37): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Oct  2 08:14:31 np0005466013 kernel: tapefd2e9f9-37: entered promiscuous mode
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00232|binding|INFO|Claiming lport efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 for this chassis.
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00233|binding|INFO|efd2e9f9-37be-44f8-ae2d-57277d8d6aa1: Claiming fa:16:3e:c7:69:5e 10.100.0.238
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4342] manager: (tape204a8e9-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/123)
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.438 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:69:5e 10.100.0.238'], port_security=['fa:16:3e:c7:69:5e 10.100.0.238'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.238/24', 'neutron:device_id': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11be1361f6f44b10a6efea8fccf616aa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8a0b041d-c4b2-499a-b557-418346b0314a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d9d3969-f564-46cb-adcb-4455de0929c2, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.439 103323 INFO neutron.agent.ovn.metadata.agent [-] Port efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 in datapath bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 bound to our chassis#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.440 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85#033[00m
Oct  2 08:14:31 np0005466013 systemd-udevd[230139]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:31 np0005466013 systemd-udevd[230140]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.453 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[71382be2-a52f-4037-ad21-2e05fdbc2b64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.455 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbf5e8c0d-e1 in ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4559] manager: (tap9a3d251d-22): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Oct  2 08:14:31 np0005466013 kernel: tap9a3d251d-22: entered promiscuous mode
Oct  2 08:14:31 np0005466013 kernel: tape204a8e9-0d: entered promiscuous mode
Oct  2 08:14:31 np0005466013 systemd-udevd[230146]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.457 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbf5e8c0d-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.457 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[096fe8a4-7c3a-4bfc-bb5f-42ac76fe0c9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.458 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bbbc63b9-c146-4eaf-997d-6fc40ee29795]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00234|binding|INFO|Claiming lport 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 for this chassis.
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00235|binding|INFO|9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8: Claiming fa:16:3e:50:70:84 10.100.0.126
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00236|binding|INFO|Claiming lport e204a8e9-0d37-4412-bc4a-98054b03a065 for this chassis.
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00237|binding|INFO|e204a8e9-0d37-4412-bc4a-98054b03a065: Claiming fa:16:3e:be:4b:1d 10.100.1.172
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00238|binding|INFO|Setting lport efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 ovn-installed in OVS
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00239|binding|INFO|Setting lport efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 up in Southbound
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.470 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:70:84 10.100.0.126'], port_security=['fa:16:3e:50:70:84 10.100.0.126'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.126/24', 'neutron:device_id': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11be1361f6f44b10a6efea8fccf616aa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8a0b041d-c4b2-499a-b557-418346b0314a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d9d3969-f564-46cb-adcb-4455de0929c2, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.472 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:4b:1d 10.100.1.172'], port_security=['fa:16:3e:be:4b:1d 10.100.1.172'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.172/24', 'neutron:device_id': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11be1361f6f44b10a6efea8fccf616aa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8a0b041d-c4b2-499a-b557-418346b0314a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6b1cfbb-3e5b-441a-a995-5309296d5c01, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=e204a8e9-0d37-4412-bc4a-98054b03a065) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.475 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[7b4ba1eb-af21-47c9-9df9-6279b0d0a3cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4786] device (tapefd2e9f9-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4795] device (tape204a8e9-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4802] device (tap9a3d251d-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4810] device (tapefd2e9f9-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4814] device (tape204a8e9-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.4817] device (tap9a3d251d-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.505 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a1abb5-d242-4955-b591-0439b3a6ccc0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00240|binding|INFO|Setting lport 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 ovn-installed in OVS
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00241|binding|INFO|Setting lport 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 up in Southbound
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00242|binding|INFO|Setting lport e204a8e9-0d37-4412-bc4a-98054b03a065 ovn-installed in OVS
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00243|binding|INFO|Setting lport e204a8e9-0d37-4412-bc4a-98054b03a065 up in Southbound
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 systemd-machined[152202]: New machine qemu-30-instance-00000049.
Oct  2 08:14:31 np0005466013 systemd[1]: Started Virtual Machine qemu-30-instance-00000049.
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.539 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9182eb-59f7-490d-9ce4-410ced6c4ca8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.5522] manager: (tapbf5e8c0d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.551 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ff8004-0f23-46a8-99a7-36ae0ea3cecb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.590 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[721ae283-682e-48bd-84be-635e1337f48d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.594 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[514356cb-62d4-49e1-8e17-af739f7d934f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.6200] device (tapbf5e8c0d-e0): carrier: link connected
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.625 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[57c15784-7091-4a8a-98fb-fc8e7260f400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.645 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc65bc84-ce02-472e-9a7f-c9abfeb0c590]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbf5e8c0d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:db:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526323, 'reachable_time': 29591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230182, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.666 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0996aed2-adf7-4051-bea5-3f9c0ceca5b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:db91'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526323, 'tstamp': 526323}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230183, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.688 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8e0227bd-c9c4-4cb5-9265-bf0418d5ae9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbf5e8c0d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:db:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526323, 'reachable_time': 29591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230184, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.721 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9e75209c-919a-420f-a805-f6c261d30489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.777 2 DEBUG nova.compute.manager [req-eb8e55a2-7583-4286-9e86-178091eeb34a req-fee75df0-bf0c-4642-850a-8f4235c3be3e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.777 2 DEBUG oslo_concurrency.lockutils [req-eb8e55a2-7583-4286-9e86-178091eeb34a req-fee75df0-bf0c-4642-850a-8f4235c3be3e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.778 2 DEBUG oslo_concurrency.lockutils [req-eb8e55a2-7583-4286-9e86-178091eeb34a req-fee75df0-bf0c-4642-850a-8f4235c3be3e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.778 2 DEBUG oslo_concurrency.lockutils [req-eb8e55a2-7583-4286-9e86-178091eeb34a req-fee75df0-bf0c-4642-850a-8f4235c3be3e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.778 2 DEBUG nova.compute.manager [req-eb8e55a2-7583-4286-9e86-178091eeb34a req-fee75df0-bf0c-4642-850a-8f4235c3be3e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Processing event network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.795 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d87237fe-d5ec-47d3-90f1-3f5c894880b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.797 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf5e8c0d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.798 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.798 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf5e8c0d-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.828 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:31 np0005466013 kernel: tapbf5e8c0d-e0: entered promiscuous mode
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 NetworkManager[51205]: <info>  [1759407271.8554] manager: (tapbf5e8c0d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.858 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbf5e8c0d-e0, col_values=(('external_ids', {'iface-id': '206f33d2-de41-4eed-9da3-d4ba07f4c53d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:31Z|00244|binding|INFO|Releasing lport 206f33d2-de41-4eed-9da3-d4ba07f4c53d from this chassis (sb_readonly=0)
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.861 2 DEBUG nova.compute.manager [req-504a5f60-78a9-448b-a9c7-e4d711164ff4 req-232d5a12-269a-4c4d-8a09-0900cde920e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.861 2 DEBUG oslo_concurrency.lockutils [req-504a5f60-78a9-448b-a9c7-e4d711164ff4 req-232d5a12-269a-4c4d-8a09-0900cde920e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.861 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.861 2 DEBUG oslo_concurrency.lockutils [req-504a5f60-78a9-448b-a9c7-e4d711164ff4 req-232d5a12-269a-4c4d-8a09-0900cde920e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.861 2 DEBUG oslo_concurrency.lockutils [req-504a5f60-78a9-448b-a9c7-e4d711164ff4 req-232d5a12-269a-4c4d-8a09-0900cde920e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.862 2 DEBUG nova.compute.manager [req-504a5f60-78a9-448b-a9c7-e4d711164ff4 req-232d5a12-269a-4c4d-8a09-0900cde920e3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Processing event network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.862 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc6ced6-3844-4796-b621-0b86ffdd4b07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.863 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85.pid.haproxy
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:14:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:31.864 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'env', 'PROCESS_TAG=haproxy-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466013 nova_compute[192144]: 2025-10-02 12:14:31.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:32 np0005466013 podman[230224]: 2025-10-02 12:14:32.255458003 +0000 UTC m=+0.050265757 container create 602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:14:32 np0005466013 systemd[1]: Started libpod-conmon-602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b.scope.
Oct  2 08:14:32 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:14:32 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ec54854c951ed04172aa487d23ec34c40d64225b78c54a6f0b4b9eb59d35292/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:14:32 np0005466013 podman[230224]: 2025-10-02 12:14:32.324605111 +0000 UTC m=+0.119412875 container init 602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:14:32 np0005466013 podman[230224]: 2025-10-02 12:14:32.229412476 +0000 UTC m=+0.024220240 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:14:32 np0005466013 podman[230224]: 2025-10-02 12:14:32.330010642 +0000 UTC m=+0.124818386 container start 602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:14:32 np0005466013 neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85[230239]: [NOTICE]   (230243) : New worker (230245) forked
Oct  2 08:14:32 np0005466013 neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85[230239]: [NOTICE]   (230243) : Loading success.
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.364 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407272.3643234, 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.365 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] VM Started (Lifecycle Event)#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.386 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.390 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407272.3644087, 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.391 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.404 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 in datapath bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 unbound from our chassis#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.406 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.408 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.412 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.422 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[207dbd57-af28-4aa1-85ca-26bd3ad43616]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.429 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.454 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[844b9d11-50b3-472d-909e-dd643f1b47ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.458 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[12de640c-8959-46e4-a7e4-511ec160bb36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.488 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[2e24e529-f376-4da0-962d-c287d2e4eb2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.510 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aaecbc9c-f222-49b6-b5d1-92f84dfd5020]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbf5e8c0d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:db:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 4, 'rx_bytes': 306, 'tx_bytes': 264, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 4, 'rx_bytes': 306, 'tx_bytes': 264, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526323, 'reachable_time': 29591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230259, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.529 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a9626abe-b8a9-43e4-bd7e-649f85041653]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbf5e8c0d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526336, 'tstamp': 526336}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230260, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapbf5e8c0d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526340, 'tstamp': 526340}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230260, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.532 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf5e8c0d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.535 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf5e8c0d-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.536 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.536 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbf5e8c0d-e0, col_values=(('external_ids', {'iface-id': '206f33d2-de41-4eed-9da3-d4ba07f4c53d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.537 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.538 103323 INFO neutron.agent.ovn.metadata.agent [-] Port e204a8e9-0d37-4412-bc4a-98054b03a065 in datapath 02195939-8d5e-4e0a-8f89-f16a0db2353e unbound from our chassis#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.540 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02195939-8d5e-4e0a-8f89-f16a0db2353e#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.558 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[492b73ee-1bd9-4f9c-8877-f3beb8ef05b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.560 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap02195939-81 in ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.563 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap02195939-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.563 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[adb59a14-7b83-47f4-90c9-e58cc7a11cfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.564 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e41fad67-42b8-4570-aa63-e8b63ccc92bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.580 2 DEBUG nova.network.neutron [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updated VIF entry in instance network info cache for port e204a8e9-0d37-4412-bc4a-98054b03a065. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.581 2 DEBUG nova.network.neutron [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updating instance_info_cache with network_info: [{"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.582 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[05aaa33e-f5fc-4411-8714-4cde99c3d797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.599 2 DEBUG oslo_concurrency.lockutils [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.600 2 DEBUG nova.compute.manager [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-changed-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.600 2 DEBUG nova.compute.manager [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Refreshing instance network info cache due to event network-changed-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.601 2 DEBUG oslo_concurrency.lockutils [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.601 2 DEBUG oslo_concurrency.lockutils [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.601 2 DEBUG nova.network.neutron [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Refreshing network info cache for port 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.613 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6f320df1-4f07-41ae-9980-6e16e4118164]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.649 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c045d7cb-ef25-457e-a6dd-bacc1ce9315a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 systemd-udevd[230171]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.659 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf41941-59f6-4d63-a6ca-dc76cc5ac2a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 NetworkManager[51205]: <info>  [1759407272.6613] manager: (tap02195939-80): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.701 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[8c62740c-e91b-4a75-9d35-a5c1c00b587b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.707 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c015b6-2350-4ff1-8f83-068a529de56a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 NetworkManager[51205]: <info>  [1759407272.7403] device (tap02195939-80): carrier: link connected
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.748 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[22818c16-6cbc-4731-8da4-2ec7976df74b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.768 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d2aaf95e-3f7e-42ff-a53b-4d9f894918e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02195939-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:b8:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526435, 'reachable_time': 28986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230271, 'error': None, 'target': 'ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.790 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc21122-eab7-4cca-8a89-489c27de86bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:b85c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526435, 'tstamp': 526435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230272, 'error': None, 'target': 'ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.815 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b8aa6223-b668-463b-b11d-65c3c544f7ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02195939-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:b8:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526435, 'reachable_time': 28986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230273, 'error': None, 'target': 'ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.860 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[50f843a7-4b47-4fe1-9525-e2b590454761]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.930 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2159aa99-116a-44b8-a9bf-4ed839dae250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.932 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02195939-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.932 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.933 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02195939-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005466013 NetworkManager[51205]: <info>  [1759407272.9890] manager: (tap02195939-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Oct  2 08:14:32 np0005466013 kernel: tap02195939-80: entered promiscuous mode
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:32.994 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02195939-80, col_values=(('external_ids', {'iface-id': '20d53610-0a53-406e-9594-a31cfd544780'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:32Z|00245|binding|INFO|Releasing lport 20d53610-0a53-406e-9594-a31cfd544780 from this chassis (sb_readonly=0)
Oct  2 08:14:32 np0005466013 nova_compute[192144]: 2025-10-02 12:14:32.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:33.014 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02195939-8d5e-4e0a-8f89-f16a0db2353e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02195939-8d5e-4e0a-8f89-f16a0db2353e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:33.015 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f19b8d8-1f0d-47f6-a239-ba7b5bcd9767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:33.015 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-02195939-8d5e-4e0a-8f89-f16a0db2353e
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/02195939-8d5e-4e0a-8f89-f16a0db2353e.pid.haproxy
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 02195939-8d5e-4e0a-8f89-f16a0db2353e
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:14:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:33.016 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'env', 'PROCESS_TAG=haproxy-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/02195939-8d5e-4e0a-8f89-f16a0db2353e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:33 np0005466013 podman[230306]: 2025-10-02 12:14:33.439299721 +0000 UTC m=+0.055020990 container create d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:14:33 np0005466013 systemd[1]: Started libpod-conmon-d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c.scope.
Oct  2 08:14:33 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:14:33 np0005466013 podman[230306]: 2025-10-02 12:14:33.411862949 +0000 UTC m=+0.027584208 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:14:33 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f26de003de8b8ebd208eefbff0f5d6b27f2851fb183005185e73d340c5c60f1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:14:33 np0005466013 podman[230306]: 2025-10-02 12:14:33.520056437 +0000 UTC m=+0.135777696 container init d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:14:33 np0005466013 podman[230306]: 2025-10-02 12:14:33.526491071 +0000 UTC m=+0.142212310 container start d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  2 08:14:33 np0005466013 neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e[230321]: [NOTICE]   (230325) : New worker (230327) forked
Oct  2 08:14:33 np0005466013 neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e[230321]: [NOTICE]   (230325) : Loading success.
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.829 2 DEBUG nova.network.neutron [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updated VIF entry in instance network info cache for port 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.831 2 DEBUG nova.network.neutron [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updating instance_info_cache with network_info: [{"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.854 2 DEBUG oslo_concurrency.lockutils [req-b83a76a5-022f-41f0-8049-231665dc5cf7 req-47452754-d9ac-4524-a8ad-b4f70bc76a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-38d330be-301c-4f63-9b1e-7ebc2a61f3e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.866 2 DEBUG nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.866 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.867 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.867 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.867 2 DEBUG nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No event matching network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 in dict_keys([('network-vif-plugged', '9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.867 2 WARNING nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received unexpected event network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.867 2 DEBUG nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.868 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.868 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.868 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.868 2 DEBUG nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Processing event network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.868 2 DEBUG nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.869 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.869 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.869 2 DEBUG oslo_concurrency.lockutils [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.869 2 DEBUG nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.870 2 WARNING nova.compute.manager [req-43cd4e9c-2e5f-4d52-be56-de099d1c27ce req-178ac8a4-b572-43dd-9906-ac782c614230 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received unexpected event network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.870 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.874 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407273.8740284, 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.874 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.876 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.881 2 INFO nova.virt.libvirt.driver [-] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Instance spawned successfully.#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.882 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.896 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.900 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.904 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.904 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.905 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.905 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.906 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.906 2 DEBUG nova.virt.libvirt.driver [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.937 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.940 2 DEBUG nova.compute.manager [req-e329d355-97df-465d-a835-f91d7f8b2b94 req-3f03e10f-8ecc-42c1-8fcd-b69b6790f1d8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.940 2 DEBUG oslo_concurrency.lockutils [req-e329d355-97df-465d-a835-f91d7f8b2b94 req-3f03e10f-8ecc-42c1-8fcd-b69b6790f1d8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.940 2 DEBUG oslo_concurrency.lockutils [req-e329d355-97df-465d-a835-f91d7f8b2b94 req-3f03e10f-8ecc-42c1-8fcd-b69b6790f1d8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.940 2 DEBUG oslo_concurrency.lockutils [req-e329d355-97df-465d-a835-f91d7f8b2b94 req-3f03e10f-8ecc-42c1-8fcd-b69b6790f1d8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.940 2 DEBUG nova.compute.manager [req-e329d355-97df-465d-a835-f91d7f8b2b94 req-3f03e10f-8ecc-42c1-8fcd-b69b6790f1d8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.941 2 WARNING nova.compute.manager [req-e329d355-97df-465d-a835-f91d7f8b2b94 req-3f03e10f-8ecc-42c1-8fcd-b69b6790f1d8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received unexpected event network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.982 2 INFO nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Took 13.90 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:14:33 np0005466013 nova_compute[192144]: 2025-10-02 12:14:33.983 2 DEBUG nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:34 np0005466013 nova_compute[192144]: 2025-10-02 12:14:34.074 2 INFO nova.compute.manager [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Took 14.76 seconds to build instance.#033[00m
Oct  2 08:14:34 np0005466013 nova_compute[192144]: 2025-10-02 12:14:34.097 2 DEBUG oslo_concurrency.lockutils [None req-cb9e71b7-e1ce-4566-9f21-60136c62ab3b 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:34 np0005466013 nova_compute[192144]: 2025-10-02 12:14:34.100 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 8.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:34 np0005466013 nova_compute[192144]: 2025-10-02 12:14:34.102 2 INFO nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:34 np0005466013 nova_compute[192144]: 2025-10-02 12:14:34.103 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:35 np0005466013 nova_compute[192144]: 2025-10-02 12:14:35.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.432 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.434 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.434 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.435 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.435 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.455 2 INFO nova.compute.manager [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Terminating instance#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.474 2 DEBUG nova.compute.manager [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:14:36 np0005466013 kernel: tapefd2e9f9-37 (unregistering): left promiscuous mode
Oct  2 08:14:36 np0005466013 NetworkManager[51205]: <info>  [1759407276.5038] device (tapefd2e9f9-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00246|binding|INFO|Releasing lport efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 from this chassis (sb_readonly=0)
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00247|binding|INFO|Setting lport efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 down in Southbound
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00248|binding|INFO|Removing iface tapefd2e9f9-37 ovn-installed in OVS
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.532 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:69:5e 10.100.0.238'], port_security=['fa:16:3e:c7:69:5e 10.100.0.238'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.238/24', 'neutron:device_id': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11be1361f6f44b10a6efea8fccf616aa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8a0b041d-c4b2-499a-b557-418346b0314a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d9d3969-f564-46cb-adcb-4455de0929c2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.533 103323 INFO neutron.agent.ovn.metadata.agent [-] Port efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 in datapath bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 unbound from our chassis#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.535 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85#033[00m
Oct  2 08:14:36 np0005466013 kernel: tape204a8e9-0d (unregistering): left promiscuous mode
Oct  2 08:14:36 np0005466013 NetworkManager[51205]: <info>  [1759407276.5448] device (tape204a8e9-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.555 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[399062ee-e242-445e-a6f0-618e8cc64503]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00249|binding|INFO|Releasing lport e204a8e9-0d37-4412-bc4a-98054b03a065 from this chassis (sb_readonly=0)
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00250|binding|INFO|Setting lport e204a8e9-0d37-4412-bc4a-98054b03a065 down in Southbound
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00251|binding|INFO|Removing iface tape204a8e9-0d ovn-installed in OVS
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.567 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:4b:1d 10.100.1.172'], port_security=['fa:16:3e:be:4b:1d 10.100.1.172'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.172/24', 'neutron:device_id': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11be1361f6f44b10a6efea8fccf616aa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8a0b041d-c4b2-499a-b557-418346b0314a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6b1cfbb-3e5b-441a-a995-5309296d5c01, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=e204a8e9-0d37-4412-bc4a-98054b03a065) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 kernel: tap9a3d251d-22 (unregistering): left promiscuous mode
Oct  2 08:14:36 np0005466013 NetworkManager[51205]: <info>  [1759407276.5818] device (tap9a3d251d-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.585 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0b91e15d-57d3-498d-b2c3-c0da60137ed4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.589 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[201e4497-06c1-4451-8f5c-7fdb8056c15a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00252|binding|INFO|Releasing lport 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 from this chassis (sb_readonly=0)
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00253|binding|INFO|Setting lport 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 down in Southbound
Oct  2 08:14:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:36Z|00254|binding|INFO|Removing iface tap9a3d251d-22 ovn-installed in OVS
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.612 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:70:84 10.100.0.126'], port_security=['fa:16:3e:50:70:84 10.100.0.126'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.126/24', 'neutron:device_id': '38d330be-301c-4f63-9b1e-7ebc2a61f3e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11be1361f6f44b10a6efea8fccf616aa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8a0b041d-c4b2-499a-b557-418346b0314a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d9d3969-f564-46cb-adcb-4455de0929c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.622 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c1101bfa-0d58-456b-b048-1d0c8e4e149a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000049.scope: Deactivated successfully.
Oct  2 08:14:36 np0005466013 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000049.scope: Consumed 3.372s CPU time.
Oct  2 08:14:36 np0005466013 systemd-machined[152202]: Machine qemu-30-instance-00000049 terminated.
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.641 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1d909e06-f23a-43ae-abbf-6cf5689a9c3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbf5e8c0d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:db:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526323, 'reachable_time': 29591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230359, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.660 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0e1a7c-8712-41b7-8fe5-5bfdde152f15]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbf5e8c0d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526336, 'tstamp': 526336}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230360, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapbf5e8c0d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526340, 'tstamp': 526340}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230360, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.662 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf5e8c0d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.675 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf5e8c0d-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.675 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.676 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbf5e8c0d-e0, col_values=(('external_ids', {'iface-id': '206f33d2-de41-4eed-9da3-d4ba07f4c53d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.676 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.678 103323 INFO neutron.agent.ovn.metadata.agent [-] Port e204a8e9-0d37-4412-bc4a-98054b03a065 in datapath 02195939-8d5e-4e0a-8f89-f16a0db2353e unbound from our chassis#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.680 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02195939-8d5e-4e0a-8f89-f16a0db2353e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.683 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c9f963-b9c0-4e46-9f53-5ae3ba6b3b6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.684 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e namespace which is not needed anymore#033[00m
Oct  2 08:14:36 np0005466013 NetworkManager[51205]: <info>  [1759407276.6988] manager: (tapefd2e9f9-37): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Oct  2 08:14:36 np0005466013 NetworkManager[51205]: <info>  [1759407276.7103] manager: (tape204a8e9-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Oct  2 08:14:36 np0005466013 NetworkManager[51205]: <info>  [1759407276.7223] manager: (tap9a3d251d-22): new Tun device (/org/freedesktop/NetworkManager/Devices/131)
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.765 2 INFO nova.virt.libvirt.driver [-] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Instance destroyed successfully.#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.766 2 DEBUG nova.objects.instance [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lazy-loading 'resources' on Instance uuid 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.786 2 DEBUG nova.virt.libvirt.vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-1305956602-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:34Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.786 2 DEBUG nova.network.os_vif_util [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.787 2 DEBUG nova.network.os_vif_util [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:69:5e,bridge_name='br-int',has_traffic_filtering=True,id=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefd2e9f9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.788 2 DEBUG os_vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:69:5e,bridge_name='br-int',has_traffic_filtering=True,id=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefd2e9f9-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.790 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefd2e9f9-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.801 2 INFO os_vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:69:5e,bridge_name='br-int',has_traffic_filtering=True,id=efd2e9f9-37be-44f8-ae2d-57277d8d6aa1,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefd2e9f9-37')#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.802 2 DEBUG nova.virt.libvirt.vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-1305956602-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:34Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.802 2 DEBUG nova.network.os_vif_util [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "e204a8e9-0d37-4412-bc4a-98054b03a065", "address": "fa:16:3e:be:4b:1d", "network": {"id": "02195939-8d5e-4e0a-8f89-f16a0db2353e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1175664720", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape204a8e9-0d", "ovs_interfaceid": "e204a8e9-0d37-4412-bc4a-98054b03a065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.803 2 DEBUG nova.network.os_vif_util [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:1d,bridge_name='br-int',has_traffic_filtering=True,id=e204a8e9-0d37-4412-bc4a-98054b03a065,network=Network(02195939-8d5e-4e0a-8f89-f16a0db2353e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape204a8e9-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.803 2 DEBUG os_vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:1d,bridge_name='br-int',has_traffic_filtering=True,id=e204a8e9-0d37-4412-bc4a-98054b03a065,network=Network(02195939-8d5e-4e0a-8f89-f16a0db2353e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape204a8e9-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.805 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape204a8e9-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.812 2 INFO os_vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:1d,bridge_name='br-int',has_traffic_filtering=True,id=e204a8e9-0d37-4412-bc4a-98054b03a065,network=Network(02195939-8d5e-4e0a-8f89-f16a0db2353e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape204a8e9-0d')#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.813 2 DEBUG nova.virt.libvirt.vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-162137885',display_name='tempest-ServersTestMultiNic-server-162137885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-162137885',id=73,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='11be1361f6f44b10a6efea8fccf616aa',ramdisk_id='',reservation_id='r-6tcn6e1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1305956602',owner_user_name='tempest-ServersTestMultiNic-1305956602-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:34Z,user_data=None,user_id='64ab4561f89846cc90cf0ab7f878cbd3',uuid=38d330be-301c-4f63-9b1e-7ebc2a61f3e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.813 2 DEBUG nova.network.os_vif_util [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converting VIF {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.814 2 DEBUG nova.network.os_vif_util [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:50:70:84,bridge_name='br-int',has_traffic_filtering=True,id=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a3d251d-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.814 2 DEBUG os_vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:70:84,bridge_name='br-int',has_traffic_filtering=True,id=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a3d251d-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.816 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a3d251d-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.820 2 INFO os_vif [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:70:84,bridge_name='br-int',has_traffic_filtering=True,id=9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8,network=Network(bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a3d251d-22')#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.820 2 INFO nova.virt.libvirt.driver [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Deleting instance files /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9_del#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.821 2 INFO nova.virt.libvirt.driver [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Deletion of /var/lib/nova/instances/38d330be-301c-4f63-9b1e-7ebc2a61f3e9_del complete#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.828 2 DEBUG nova.compute.manager [req-f7056ac7-870d-4f38-9174-683c93767891 req-42fb86f0-ba13-4bfc-a97d-7d5ba5c29dbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-unplugged-e204a8e9-0d37-4412-bc4a-98054b03a065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.828 2 DEBUG oslo_concurrency.lockutils [req-f7056ac7-870d-4f38-9174-683c93767891 req-42fb86f0-ba13-4bfc-a97d-7d5ba5c29dbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.829 2 DEBUG oslo_concurrency.lockutils [req-f7056ac7-870d-4f38-9174-683c93767891 req-42fb86f0-ba13-4bfc-a97d-7d5ba5c29dbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.829 2 DEBUG oslo_concurrency.lockutils [req-f7056ac7-870d-4f38-9174-683c93767891 req-42fb86f0-ba13-4bfc-a97d-7d5ba5c29dbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.829 2 DEBUG nova.compute.manager [req-f7056ac7-870d-4f38-9174-683c93767891 req-42fb86f0-ba13-4bfc-a97d-7d5ba5c29dbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-unplugged-e204a8e9-0d37-4412-bc4a-98054b03a065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.830 2 DEBUG nova.compute.manager [req-f7056ac7-870d-4f38-9174-683c93767891 req-42fb86f0-ba13-4bfc-a97d-7d5ba5c29dbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-unplugged-e204a8e9-0d37-4412-bc4a-98054b03a065 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.837 2 DEBUG nova.compute.manager [req-52527337-1001-45d9-bebe-74f0fd10751c req-d8799857-6bd6-4571-9715-2b839dd5031c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-unplugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.837 2 DEBUG oslo_concurrency.lockutils [req-52527337-1001-45d9-bebe-74f0fd10751c req-d8799857-6bd6-4571-9715-2b839dd5031c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.837 2 DEBUG oslo_concurrency.lockutils [req-52527337-1001-45d9-bebe-74f0fd10751c req-d8799857-6bd6-4571-9715-2b839dd5031c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.838 2 DEBUG oslo_concurrency.lockutils [req-52527337-1001-45d9-bebe-74f0fd10751c req-d8799857-6bd6-4571-9715-2b839dd5031c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.838 2 DEBUG nova.compute.manager [req-52527337-1001-45d9-bebe-74f0fd10751c req-d8799857-6bd6-4571-9715-2b839dd5031c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-unplugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.838 2 DEBUG nova.compute.manager [req-52527337-1001-45d9-bebe-74f0fd10751c req-d8799857-6bd6-4571-9715-2b839dd5031c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-unplugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:14:36 np0005466013 neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e[230321]: [NOTICE]   (230325) : haproxy version is 2.8.14-c23fe91
Oct  2 08:14:36 np0005466013 neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e[230321]: [NOTICE]   (230325) : path to executable is /usr/sbin/haproxy
Oct  2 08:14:36 np0005466013 neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e[230321]: [WARNING]  (230325) : Exiting Master process...
Oct  2 08:14:36 np0005466013 neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e[230321]: [ALERT]    (230325) : Current worker (230327) exited with code 143 (Terminated)
Oct  2 08:14:36 np0005466013 neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e[230321]: [WARNING]  (230325) : All workers exited. Exiting... (0)
Oct  2 08:14:36 np0005466013 systemd[1]: libpod-d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c.scope: Deactivated successfully.
Oct  2 08:14:36 np0005466013 podman[230424]: 2025-10-02 12:14:36.850982818 +0000 UTC m=+0.049514364 container died d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:14:36 np0005466013 systemd[1]: var-lib-containers-storage-overlay-3f26de003de8b8ebd208eefbff0f5d6b27f2851fb183005185e73d340c5c60f1-merged.mount: Deactivated successfully.
Oct  2 08:14:36 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:14:36 np0005466013 podman[230424]: 2025-10-02 12:14:36.886221928 +0000 UTC m=+0.084753464 container cleanup d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.886 2 INFO nova.compute.manager [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.887 2 DEBUG oslo.service.loopingcall [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.887 2 DEBUG nova.compute.manager [-] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.888 2 DEBUG nova.network.neutron [-] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:14:36 np0005466013 systemd[1]: libpod-conmon-d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c.scope: Deactivated successfully.
Oct  2 08:14:36 np0005466013 podman[230456]: 2025-10-02 12:14:36.965397284 +0000 UTC m=+0.058716157 container remove d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.974 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c204b572-58fd-4d94-a935-55766641feb6]: (4, ('Thu Oct  2 12:14:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e (d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c)\nd824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c\nThu Oct  2 12:14:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e (d824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c)\nd824d6a9c704e5fed9ffb956f34711ecb525f4e543e3b4d543577053cf8d352c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.976 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5c911ecf-bfd6-4717-9975-a74e04aa25d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.977 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02195939-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 kernel: tap02195939-80: left promiscuous mode
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:36.986 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ee77b310-f107-446b-8473-d141c2d420f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:36 np0005466013 nova_compute[192144]: 2025-10-02 12:14:36.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.013 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[15aa0f2c-4107-4fe8-ac09-e87c982845be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.015 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9ce4d4-fb86-4a4a-bc5a-e3e6867b64fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.036 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[95261674-c2e0-4876-8730-26c827c72a8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526425, 'reachable_time': 24907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230470, 'error': None, 'target': 'ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.039 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-02195939-8d5e-4e0a-8f89-f16a0db2353e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.040 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[bd27d258-3270-4681-b714-fa9610d881c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.041 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 in datapath bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 unbound from our chassis#033[00m
Oct  2 08:14:37 np0005466013 systemd[1]: run-netns-ovnmeta\x2d02195939\x2d8d5e\x2d4e0a\x2d8f89\x2df16a0db2353e.mount: Deactivated successfully.
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.042 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.043 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ce90e908-0dc8-4d66-8382-5188e9dc053d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.043 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 namespace which is not needed anymore#033[00m
Oct  2 08:14:37 np0005466013 neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85[230239]: [NOTICE]   (230243) : haproxy version is 2.8.14-c23fe91
Oct  2 08:14:37 np0005466013 neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85[230239]: [NOTICE]   (230243) : path to executable is /usr/sbin/haproxy
Oct  2 08:14:37 np0005466013 neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85[230239]: [WARNING]  (230243) : Exiting Master process...
Oct  2 08:14:37 np0005466013 neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85[230239]: [ALERT]    (230243) : Current worker (230245) exited with code 143 (Terminated)
Oct  2 08:14:37 np0005466013 neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85[230239]: [WARNING]  (230243) : All workers exited. Exiting... (0)
Oct  2 08:14:37 np0005466013 systemd[1]: libpod-602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b.scope: Deactivated successfully.
Oct  2 08:14:37 np0005466013 podman[230489]: 2025-10-02 12:14:37.175618863 +0000 UTC m=+0.046019653 container died 602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:14:37 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:14:37 np0005466013 systemd[1]: var-lib-containers-storage-overlay-6ec54854c951ed04172aa487d23ec34c40d64225b78c54a6f0b4b9eb59d35292-merged.mount: Deactivated successfully.
Oct  2 08:14:37 np0005466013 podman[230489]: 2025-10-02 12:14:37.286658302 +0000 UTC m=+0.157059102 container cleanup 602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:14:37 np0005466013 systemd[1]: libpod-conmon-602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b.scope: Deactivated successfully.
Oct  2 08:14:37 np0005466013 podman[230521]: 2025-10-02 12:14:37.364409822 +0000 UTC m=+0.049377789 container remove 602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.371 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[19ab8c26-21b9-402e-8c90-a654530cdea8]: (4, ('Thu Oct  2 12:14:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 (602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b)\n602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b\nThu Oct  2 12:14:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 (602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b)\n602e3762807b50a85cd970dfe22ca443457d7b129557c33bfb24c6793f1f6d6b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.374 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1faf847b-ca67-4938-af65-6bcd3220c266]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.375 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf5e8c0d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:37 np0005466013 nova_compute[192144]: 2025-10-02 12:14:37.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:37 np0005466013 kernel: tapbf5e8c0d-e0: left promiscuous mode
Oct  2 08:14:37 np0005466013 nova_compute[192144]: 2025-10-02 12:14:37.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.383 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d56c2e-8839-4566-afef-003e14ef962f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 nova_compute[192144]: 2025-10-02 12:14:37.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.417 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4b166aea-1dbc-486a-8490-d49ee3e10238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.419 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa06db1-75cc-4b70-8744-2895184bbb7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.436 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6399cdcf-8b10-4869-9492-99507f2d6459]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526314, 'reachable_time': 44289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230536, 'error': None, 'target': 'ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.438 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:14:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:37.438 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[5edfb7af-7104-4609-b27f-4c6d19595970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:37 np0005466013 nova_compute[192144]: 2025-10-02 12:14:37.833 2 DEBUG nova.compute.manager [req-2cedfd69-48ae-4b21-856a-3c9a1f058fdb req-38f7244b-ccea-4fea-9d3d-30acab6728b1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-deleted-e204a8e9-0d37-4412-bc4a-98054b03a065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:37 np0005466013 nova_compute[192144]: 2025-10-02 12:14:37.833 2 INFO nova.compute.manager [req-2cedfd69-48ae-4b21-856a-3c9a1f058fdb req-38f7244b-ccea-4fea-9d3d-30acab6728b1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Neutron deleted interface e204a8e9-0d37-4412-bc4a-98054b03a065; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:14:37 np0005466013 nova_compute[192144]: 2025-10-02 12:14:37.833 2 DEBUG nova.network.neutron [req-2cedfd69-48ae-4b21-856a-3c9a1f058fdb req-38f7244b-ccea-4fea-9d3d-30acab6728b1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updating instance_info_cache with network_info: [{"id": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "address": "fa:16:3e:c7:69:5e", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefd2e9f9-37", "ovs_interfaceid": "efd2e9f9-37be-44f8-ae2d-57277d8d6aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "address": "fa:16:3e:50:70:84", "network": {"id": "bf5e8c0d-ee9c-47ab-bd3e-efd4f4e2fc85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-797982943", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.126", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11be1361f6f44b10a6efea8fccf616aa", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a3d251d-22", "ovs_interfaceid": "9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:37 np0005466013 nova_compute[192144]: 2025-10-02 12:14:37.859 2 DEBUG nova.compute.manager [req-2cedfd69-48ae-4b21-856a-3c9a1f058fdb req-38f7244b-ccea-4fea-9d3d-30acab6728b1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Detach interface failed, port_id=e204a8e9-0d37-4412-bc4a-98054b03a065, reason: Instance 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:14:37 np0005466013 systemd[1]: run-netns-ovnmeta\x2dbf5e8c0d\x2dee9c\x2d47ab\x2dbd3e\x2defd4f4e2fc85.mount: Deactivated successfully.
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.722 2 DEBUG nova.network.neutron [-] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.745 2 INFO nova.compute.manager [-] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Took 1.86 seconds to deallocate network for instance.#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.824 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.824 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.880 2 DEBUG nova.compute.provider_tree [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.896 2 DEBUG nova.scheduler.client.report [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.903 2 DEBUG nova.compute.manager [req-5473a90a-df5e-46b4-a438-9ec47336fb6f req-6efed9ae-2568-469f-a1cb-5eb12d4f18b5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.903 2 DEBUG oslo_concurrency.lockutils [req-5473a90a-df5e-46b4-a438-9ec47336fb6f req-6efed9ae-2568-469f-a1cb-5eb12d4f18b5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.904 2 DEBUG oslo_concurrency.lockutils [req-5473a90a-df5e-46b4-a438-9ec47336fb6f req-6efed9ae-2568-469f-a1cb-5eb12d4f18b5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.904 2 DEBUG oslo_concurrency.lockutils [req-5473a90a-df5e-46b4-a438-9ec47336fb6f req-6efed9ae-2568-469f-a1cb-5eb12d4f18b5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.904 2 DEBUG nova.compute.manager [req-5473a90a-df5e-46b4-a438-9ec47336fb6f req-6efed9ae-2568-469f-a1cb-5eb12d4f18b5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.904 2 WARNING nova.compute.manager [req-5473a90a-df5e-46b4-a438-9ec47336fb6f req-6efed9ae-2568-469f-a1cb-5eb12d4f18b5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received unexpected event network-vif-plugged-e204a8e9-0d37-4412-bc4a-98054b03a065 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.915 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.942 2 INFO nova.scheduler.client.report [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Deleted allocations for instance 38d330be-301c-4f63-9b1e-7ebc2a61f3e9#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.961 2 DEBUG nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.962 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.962 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.962 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.963 2 DEBUG nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.963 2 WARNING nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received unexpected event network-vif-plugged-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.963 2 DEBUG nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-unplugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.963 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.964 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.964 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.964 2 DEBUG nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-unplugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.964 2 WARNING nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received unexpected event network-vif-unplugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.964 2 DEBUG nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.964 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.965 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.965 2 DEBUG oslo_concurrency.lockutils [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.965 2 DEBUG nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] No waiting events found dispatching network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:38 np0005466013 nova_compute[192144]: 2025-10-02 12:14:38.965 2 WARNING nova.compute.manager [req-5b759552-cd85-43ac-9cf0-17aea9983713 req-eca986b5-3f4a-425c-bf36-3f78ceab65c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received unexpected event network-vif-plugged-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:14:39 np0005466013 nova_compute[192144]: 2025-10-02 12:14:39.014 2 DEBUG oslo_concurrency.lockutils [None req-ca27360f-435b-4514-903f-70018394c079 64ab4561f89846cc90cf0ab7f878cbd3 11be1361f6f44b10a6efea8fccf616aa - - default default] Lock "38d330be-301c-4f63-9b1e-7ebc2a61f3e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:39 np0005466013 podman[230538]: 2025-10-02 12:14:39.699358508 +0000 UTC m=+0.062089483 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:14:39 np0005466013 podman[230537]: 2025-10-02 12:14:39.704029097 +0000 UTC m=+0.060450712 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:14:39 np0005466013 podman[230539]: 2025-10-02 12:14:39.766822442 +0000 UTC m=+0.127276505 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:14:39 np0005466013 nova_compute[192144]: 2025-10-02 12:14:39.925 2 DEBUG nova.compute.manager [req-3ae30d4a-7479-48d8-b881-f8a4e1e685b0 req-b5557216-24d6-4b37-8015-ebc5a4f632cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-deleted-efd2e9f9-37be-44f8-ae2d-57277d8d6aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:39 np0005466013 nova_compute[192144]: 2025-10-02 12:14:39.925 2 DEBUG nova.compute.manager [req-3ae30d4a-7479-48d8-b881-f8a4e1e685b0 req-b5557216-24d6-4b37-8015-ebc5a4f632cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Received event network-vif-deleted-9a3d251d-227f-48ef-8f0c-0a2d1f1b15f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.376 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.376 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.392 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.477 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.477 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.484 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.484 2 INFO nova.compute.claims [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.636 2 DEBUG nova.compute.provider_tree [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.649 2 DEBUG nova.scheduler.client.report [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.697 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.698 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.762 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.762 2 DEBUG nova.network.neutron [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.783 2 INFO nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.799 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.915 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.918 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.919 2 INFO nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Creating image(s)
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.919 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.920 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.921 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:14:40 np0005466013 nova_compute[192144]: 2025-10-02 12:14:40.940 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.022 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.023 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.024 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.036 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.095 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.096 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.135 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.136 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.137 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.202 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.206 2 DEBUG nova.virt.disk.api [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Checking if we can resize image /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.207 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.269 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.271 2 DEBUG nova.virt.disk.api [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Cannot resize image /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.271 2 DEBUG nova.objects.instance [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'migration_context' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.298 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.298 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Ensure instance console log exists: /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.299 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.299 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.299 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.338 2 DEBUG nova.policy [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:14:41 np0005466013 nova_compute[192144]: 2025-10-02 12:14:41.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:42 np0005466013 nova_compute[192144]: 2025-10-02 12:14:42.349 2 DEBUG nova.network.neutron [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Successfully created port: cea86da2-59ef-4fb2-a414-04dbebd56e75 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:14:43 np0005466013 nova_compute[192144]: 2025-10-02 12:14:43.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:14:43 np0005466013 nova_compute[192144]: 2025-10-02 12:14:43.819 2 DEBUG nova.network.neutron [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Successfully updated port: cea86da2-59ef-4fb2-a414-04dbebd56e75 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:14:43 np0005466013 nova_compute[192144]: 2025-10-02 12:14:43.836 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:14:43 np0005466013 nova_compute[192144]: 2025-10-02 12:14:43.836 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquired lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:14:43 np0005466013 nova_compute[192144]: 2025-10-02 12:14:43.837 2 DEBUG nova.network.neutron [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:14:44 np0005466013 nova_compute[192144]: 2025-10-02 12:14:44.644 2 DEBUG nova.network.neutron [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:14:44 np0005466013 nova_compute[192144]: 2025-10-02 12:14:44.724 2 DEBUG nova.compute.manager [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-changed-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:14:44 np0005466013 nova_compute[192144]: 2025-10-02 12:14:44.725 2 DEBUG nova.compute.manager [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Refreshing instance network info cache due to event network-changed-cea86da2-59ef-4fb2-a414-04dbebd56e75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:14:44 np0005466013 nova_compute[192144]: 2025-10-02 12:14:44.725 2 DEBUG oslo_concurrency.lockutils [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.631 2 DEBUG nova.network.neutron [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Updating instance_info_cache with network_info: [{"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.661 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Releasing lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.661 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance network_info: |[{"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.662 2 DEBUG oslo_concurrency.lockutils [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.662 2 DEBUG nova.network.neutron [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Refreshing network info cache for port cea86da2-59ef-4fb2-a414-04dbebd56e75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.664 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Start _get_guest_xml network_info=[{"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.668 2 WARNING nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.675 2 DEBUG nova.virt.libvirt.host [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.676 2 DEBUG nova.virt.libvirt.host [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.679 2 DEBUG nova.virt.libvirt.host [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.680 2 DEBUG nova.virt.libvirt.host [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.681 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.681 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.681 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.682 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.682 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.682 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.682 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.682 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.683 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.683 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.683 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.683 2 DEBUG nova.virt.hardware [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.686 2 DEBUG nova.virt.libvirt.vif [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-899246229',display_name='tempest-ServerRescueTestJSON-server-899246229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-899246229',id=74,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-l0lf8422',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-1453830403-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:40Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=70d78115-9cfc-487c-95b6-f4f4149c52a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.687 2 DEBUG nova.network.os_vif_util [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.687 2 DEBUG nova.network.os_vif_util [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.688 2 DEBUG nova.objects.instance [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.703 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <uuid>70d78115-9cfc-487c-95b6-f4f4149c52a1</uuid>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <name>instance-0000004a</name>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerRescueTestJSON-server-899246229</nova:name>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:14:45</nova:creationTime>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:user uuid="7b25dedc41b548469d2b0627e3255b9e">tempest-ServerRescueTestJSON-1453830403-project-member</nova:user>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:project uuid="da589921190a470cab62d12688f03735">tempest-ServerRescueTestJSON-1453830403</nova:project>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        <nova:port uuid="cea86da2-59ef-4fb2-a414-04dbebd56e75">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <entry name="serial">70d78115-9cfc-487c-95b6-f4f4149c52a1</entry>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <entry name="uuid">70d78115-9cfc-487c-95b6-f4f4149c52a1</entry>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:06:84:17"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <target dev="tapcea86da2-59"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/console.log" append="off"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:14:45 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:14:45 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:14:45 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:14:45 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.704 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Preparing to wait for external event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.704 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.705 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.705 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.705 2 DEBUG nova.virt.libvirt.vif [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-899246229',display_name='tempest-ServerRescueTestJSON-server-899246229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-899246229',id=74,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-l0lf8422',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-1453830403-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:40Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=70d78115-9cfc-487c-95b6-f4f4149c52a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.706 2 DEBUG nova.network.os_vif_util [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.706 2 DEBUG nova.network.os_vif_util [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.707 2 DEBUG os_vif [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.708 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.712 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcea86da2-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.713 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcea86da2-59, col_values=(('external_ids', {'iface-id': 'cea86da2-59ef-4fb2-a414-04dbebd56e75', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:84:17', 'vm-uuid': '70d78115-9cfc-487c-95b6-f4f4149c52a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:45 np0005466013 NetworkManager[51205]: <info>  [1759407285.7155] manager: (tapcea86da2-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.721 2 INFO os_vif [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59')#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.777 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.778 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.778 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No VIF found with MAC fa:16:3e:06:84:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:45 np0005466013 nova_compute[192144]: 2025-10-02 12:14:45.778 2 INFO nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Using config drive#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.321 2 INFO nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Creating config drive at /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.327 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwg8uj0_4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.457 2 DEBUG oslo_concurrency.processutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwg8uj0_4" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:46 np0005466013 kernel: tapcea86da2-59: entered promiscuous mode
Oct  2 08:14:46 np0005466013 NetworkManager[51205]: <info>  [1759407286.5239] manager: (tapcea86da2-59): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:46Z|00255|binding|INFO|Claiming lport cea86da2-59ef-4fb2-a414-04dbebd56e75 for this chassis.
Oct  2 08:14:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:46Z|00256|binding|INFO|cea86da2-59ef-4fb2-a414-04dbebd56e75: Claiming fa:16:3e:06:84:17 10.100.0.14
Oct  2 08:14:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:46.588 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:84:17 10.100.0.14'], port_security=['fa:16:3e:06:84:17 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cea86da2-59ef-4fb2-a414-04dbebd56e75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:46.590 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cea86da2-59ef-4fb2-a414-04dbebd56e75 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b bound to our chassis#033[00m
Oct  2 08:14:46 np0005466013 systemd-udevd[230631]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:46.591 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:14:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:14:46.593 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e70166c4-4d1a-4c5b-817d-6ed79bbad1f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:46 np0005466013 NetworkManager[51205]: <info>  [1759407286.6052] device (tapcea86da2-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:46 np0005466013 NetworkManager[51205]: <info>  [1759407286.6068] device (tapcea86da2-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:46 np0005466013 systemd-machined[152202]: New machine qemu-31-instance-0000004a.
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:46Z|00257|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 ovn-installed in OVS
Oct  2 08:14:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:14:46Z|00258|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 up in Southbound
Oct  2 08:14:46 np0005466013 systemd[1]: Started Virtual Machine qemu-31-instance-0000004a.
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.867 2 DEBUG nova.compute.manager [req-ab85b587-76f9-48c0-8a55-fc6636a7f67a req-65539613-754b-40a0-a530-3ec933585a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.868 2 DEBUG oslo_concurrency.lockutils [req-ab85b587-76f9-48c0-8a55-fc6636a7f67a req-65539613-754b-40a0-a530-3ec933585a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.868 2 DEBUG oslo_concurrency.lockutils [req-ab85b587-76f9-48c0-8a55-fc6636a7f67a req-65539613-754b-40a0-a530-3ec933585a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.869 2 DEBUG oslo_concurrency.lockutils [req-ab85b587-76f9-48c0-8a55-fc6636a7f67a req-65539613-754b-40a0-a530-3ec933585a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:46 np0005466013 nova_compute[192144]: 2025-10-02 12:14:46.869 2 DEBUG nova.compute.manager [req-ab85b587-76f9-48c0-8a55-fc6636a7f67a req-65539613-754b-40a0-a530-3ec933585a42 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Processing event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.388 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407287.3877842, 70d78115-9cfc-487c-95b6-f4f4149c52a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.389 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.390 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.393 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.397 2 INFO nova.virt.libvirt.driver [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance spawned successfully.#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.397 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.417 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.422 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.425 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.425 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.426 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.426 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.426 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.427 2 DEBUG nova.virt.libvirt.driver [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.454 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.455 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407287.3882833, 70d78115-9cfc-487c-95b6-f4f4149c52a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.455 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.483 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.487 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407287.3930101, 70d78115-9cfc-487c-95b6-f4f4149c52a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.487 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.517 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.522 2 INFO nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Took 6.61 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.523 2 DEBUG nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.524 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.556 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.613 2 INFO nova.compute.manager [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Took 7.17 seconds to build instance.#033[00m
Oct  2 08:14:47 np0005466013 nova_compute[192144]: 2025-10-02 12:14:47.631 2 DEBUG oslo_concurrency.lockutils [None req-f5845a8b-efe0-4108-8a1a-9e55dbcfb8ac 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.055 2 DEBUG nova.network.neutron [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Updated VIF entry in instance network info cache for port cea86da2-59ef-4fb2-a414-04dbebd56e75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.055 2 DEBUG nova.network.neutron [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Updating instance_info_cache with network_info: [{"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.073 2 DEBUG oslo_concurrency.lockutils [req-a5917775-6e9f-4c50-a295-feb9c4b6764d req-19c0d3b1-e0b1-43a8-973d-f641c311d355 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.960 2 DEBUG nova.compute.manager [req-7291c5ef-f4de-4f38-afa2-a90710cc4b36 req-091c81c9-ff25-4ecb-a3aa-301a8f5af2c8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.961 2 DEBUG oslo_concurrency.lockutils [req-7291c5ef-f4de-4f38-afa2-a90710cc4b36 req-091c81c9-ff25-4ecb-a3aa-301a8f5af2c8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.961 2 DEBUG oslo_concurrency.lockutils [req-7291c5ef-f4de-4f38-afa2-a90710cc4b36 req-091c81c9-ff25-4ecb-a3aa-301a8f5af2c8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.962 2 DEBUG oslo_concurrency.lockutils [req-7291c5ef-f4de-4f38-afa2-a90710cc4b36 req-091c81c9-ff25-4ecb-a3aa-301a8f5af2c8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.962 2 DEBUG nova.compute.manager [req-7291c5ef-f4de-4f38-afa2-a90710cc4b36 req-091c81c9-ff25-4ecb-a3aa-301a8f5af2c8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:48 np0005466013 nova_compute[192144]: 2025-10-02 12:14:48.962 2 WARNING nova.compute.manager [req-7291c5ef-f4de-4f38-afa2-a90710cc4b36 req-091c81c9-ff25-4ecb-a3aa-301a8f5af2c8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:14:50 np0005466013 nova_compute[192144]: 2025-10-02 12:14:50.212 2 INFO nova.compute.manager [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Rescuing#033[00m
Oct  2 08:14:50 np0005466013 nova_compute[192144]: 2025-10-02 12:14:50.213 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:50 np0005466013 nova_compute[192144]: 2025-10-02 12:14:50.213 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquired lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:50 np0005466013 nova_compute[192144]: 2025-10-02 12:14:50.213 2 DEBUG nova.network.neutron [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:14:50 np0005466013 podman[230650]: 2025-10-02 12:14:50.706462602 +0000 UTC m=+0.075724267 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm)
Oct  2 08:14:50 np0005466013 nova_compute[192144]: 2025-10-02 12:14:50.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:51 np0005466013 nova_compute[192144]: 2025-10-02 12:14:51.764 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407276.7629204, 38d330be-301c-4f63-9b1e-7ebc2a61f3e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:51 np0005466013 nova_compute[192144]: 2025-10-02 12:14:51.764 2 INFO nova.compute.manager [-] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:14:51 np0005466013 nova_compute[192144]: 2025-10-02 12:14:51.787 2 DEBUG nova.compute.manager [None req-e22b774d-e943-4750-81ad-198d07ae1901 - - - - - -] [instance: 38d330be-301c-4f63-9b1e-7ebc2a61f3e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:52 np0005466013 podman[230671]: 2025-10-02 12:14:52.706917287 +0000 UTC m=+0.076193592 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, version=9.6, name=ubi9-minimal, architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:14:52 np0005466013 podman[230670]: 2025-10-02 12:14:52.709250501 +0000 UTC m=+0.081280593 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:14:53 np0005466013 nova_compute[192144]: 2025-10-02 12:14:53.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:54 np0005466013 nova_compute[192144]: 2025-10-02 12:14:54.049 2 DEBUG nova.network.neutron [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Updating instance_info_cache with network_info: [{"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:54 np0005466013 nova_compute[192144]: 2025-10-02 12:14:54.069 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Releasing lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:54 np0005466013 nova_compute[192144]: 2025-10-02 12:14:54.303 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:14:55 np0005466013 nova_compute[192144]: 2025-10-02 12:14:55.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:57 np0005466013 podman[230709]: 2025-10-02 12:14:57.676077974 +0000 UTC m=+0.052378959 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:14:57 np0005466013 podman[230710]: 2025-10-02 12:14:57.684494834 +0000 UTC m=+0.056019160 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:14:58 np0005466013 nova_compute[192144]: 2025-10-02 12:14:58.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:00 np0005466013 nova_compute[192144]: 2025-10-02 12:15:00.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:02.297 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:02.297 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:02.297 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:03 np0005466013 nova_compute[192144]: 2025-10-02 12:15:03.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:04 np0005466013 nova_compute[192144]: 2025-10-02 12:15:04.346 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:15:05 np0005466013 nova_compute[192144]: 2025-10-02 12:15:05.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:06 np0005466013 kernel: tapcea86da2-59 (unregistering): left promiscuous mode
Oct  2 08:15:06 np0005466013 NetworkManager[51205]: <info>  [1759407306.5146] device (tapcea86da2-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:06Z|00259|binding|INFO|Releasing lport cea86da2-59ef-4fb2-a414-04dbebd56e75 from this chassis (sb_readonly=0)
Oct  2 08:15:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:06Z|00260|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 down in Southbound
Oct  2 08:15:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:06Z|00261|binding|INFO|Removing iface tapcea86da2-59 ovn-installed in OVS
Oct  2 08:15:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:06.540 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:84:17 10.100.0.14'], port_security=['fa:16:3e:06:84:17 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cea86da2-59ef-4fb2-a414-04dbebd56e75) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:06.541 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cea86da2-59ef-4fb2-a414-04dbebd56e75 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b unbound from our chassis#033[00m
Oct  2 08:15:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:06.542 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:15:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:06.545 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fec7d14a-02f8-4de0-a0cd-2e260f11de61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:06 np0005466013 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Oct  2 08:15:06 np0005466013 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004a.scope: Consumed 13.348s CPU time.
Oct  2 08:15:06 np0005466013 systemd-machined[152202]: Machine qemu-31-instance-0000004a terminated.
Oct  2 08:15:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:06.819 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:06.820 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.826 2 DEBUG nova.compute.manager [req-991bf4a8-fac4-4729-bbc0-8b737c24a748 req-33b123fc-5ba2-4f3d-8045-8c838b37ae60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.827 2 DEBUG oslo_concurrency.lockutils [req-991bf4a8-fac4-4729-bbc0-8b737c24a748 req-33b123fc-5ba2-4f3d-8045-8c838b37ae60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.827 2 DEBUG oslo_concurrency.lockutils [req-991bf4a8-fac4-4729-bbc0-8b737c24a748 req-33b123fc-5ba2-4f3d-8045-8c838b37ae60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.828 2 DEBUG oslo_concurrency.lockutils [req-991bf4a8-fac4-4729-bbc0-8b737c24a748 req-33b123fc-5ba2-4f3d-8045-8c838b37ae60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.828 2 DEBUG nova.compute.manager [req-991bf4a8-fac4-4729-bbc0-8b737c24a748 req-33b123fc-5ba2-4f3d-8045-8c838b37ae60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:06 np0005466013 nova_compute[192144]: 2025-10-02 12:15:06.828 2 WARNING nova.compute.manager [req-991bf4a8-fac4-4729-bbc0-8b737c24a748 req-33b123fc-5ba2-4f3d-8045-8c838b37ae60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.361 2 INFO nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.369 2 INFO nova.virt.libvirt.driver [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance destroyed successfully.#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.369 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'numa_topology' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.395 2 INFO nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Attempting rescue#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.396 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.401 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.401 2 INFO nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Creating image(s)#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.402 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.402 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.403 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.403 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.449 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.450 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.463 2 DEBUG oslo_concurrency.processutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.525 2 DEBUG oslo_concurrency.processutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.526 2 DEBUG oslo_concurrency.processutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.568 2 DEBUG oslo_concurrency.processutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.rescue" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.570 2 DEBUG oslo_concurrency.lockutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.571 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'migration_context' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.586 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.587 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Start _get_guest_xml network_info=[{"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2032524031-network", "vif_mac": "fa:16:3e:06:84:17"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.588 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'resources' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.614 2 WARNING nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.624 2 DEBUG nova.virt.libvirt.host [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.625 2 DEBUG nova.virt.libvirt.host [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.630 2 DEBUG nova.virt.libvirt.host [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.631 2 DEBUG nova.virt.libvirt.host [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.633 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.633 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.633 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.634 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.634 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.634 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.634 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.634 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.635 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.635 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.635 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.635 2 DEBUG nova.virt.hardware [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.636 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.655 2 DEBUG nova.virt.libvirt.vif [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-899246229',display_name='tempest-ServerRescueTestJSON-server-899246229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-899246229',id=74,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-l0lf8422',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-1453830403-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:47Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=70d78115-9cfc-487c-95b6-f4f4149c52a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2032524031-network", "vif_mac": "fa:16:3e:06:84:17"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.656 2 DEBUG nova.network.os_vif_util [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2032524031-network", "vif_mac": "fa:16:3e:06:84:17"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.657 2 DEBUG nova.network.os_vif_util [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.658 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.691 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <uuid>70d78115-9cfc-487c-95b6-f4f4149c52a1</uuid>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <name>instance-0000004a</name>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerRescueTestJSON-server-899246229</nova:name>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:15:07</nova:creationTime>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:user uuid="7b25dedc41b548469d2b0627e3255b9e">tempest-ServerRescueTestJSON-1453830403-project-member</nova:user>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:project uuid="da589921190a470cab62d12688f03735">tempest-ServerRescueTestJSON-1453830403</nova:project>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        <nova:port uuid="cea86da2-59ef-4fb2-a414-04dbebd56e75">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <entry name="serial">70d78115-9cfc-487c-95b6-f4f4149c52a1</entry>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <entry name="uuid">70d78115-9cfc-487c-95b6-f4f4149c52a1</entry>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.rescue"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config.rescue"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:06:84:17"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <target dev="tapcea86da2-59"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/console.log" append="off"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:15:07 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:15:07 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:15:07 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:15:07 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.697 2 INFO nova.virt.libvirt.driver [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance destroyed successfully.#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.766 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.767 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.767 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.767 2 DEBUG nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No VIF found with MAC fa:16:3e:06:84:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.768 2 INFO nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Using config drive#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.795 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:07 np0005466013 nova_compute[192144]: 2025-10-02 12:15:07.827 2 DEBUG nova.objects.instance [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'keypairs' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.401 2 INFO nova.virt.libvirt.driver [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Creating config drive at /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config.rescue#033[00m
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.406 2 DEBUG oslo_concurrency.processutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbybgn4d7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.534 2 DEBUG oslo_concurrency.processutils [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbybgn4d7" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:08 np0005466013 kernel: tapcea86da2-59: entered promiscuous mode
Oct  2 08:15:08 np0005466013 NetworkManager[51205]: <info>  [1759407308.6123] manager: (tapcea86da2-59): new Tun device (/org/freedesktop/NetworkManager/Devices/134)
Oct  2 08:15:08 np0005466013 systemd-udevd[230776]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:08Z|00262|binding|INFO|Claiming lport cea86da2-59ef-4fb2-a414-04dbebd56e75 for this chassis.
Oct  2 08:15:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:08Z|00263|binding|INFO|cea86da2-59ef-4fb2-a414-04dbebd56e75: Claiming fa:16:3e:06:84:17 10.100.0.14
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:08 np0005466013 NetworkManager[51205]: <info>  [1759407308.6266] device (tapcea86da2-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:15:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:08.625 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:84:17 10.100.0.14'], port_security=['fa:16:3e:06:84:17 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cea86da2-59ef-4fb2-a414-04dbebd56e75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:08.626 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cea86da2-59ef-4fb2-a414-04dbebd56e75 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b bound to our chassis#033[00m
Oct  2 08:15:08 np0005466013 NetworkManager[51205]: <info>  [1759407308.6281] device (tapcea86da2-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:15:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:08.627 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:15:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:08.628 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa93c88-9e2f-4323-bf38-cf578e58bb24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:08Z|00264|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 ovn-installed in OVS
Oct  2 08:15:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:08Z|00265|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 up in Southbound
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:08 np0005466013 nova_compute[192144]: 2025-10-02 12:15:08.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:08 np0005466013 systemd-machined[152202]: New machine qemu-32-instance-0000004a.
Oct  2 08:15:08 np0005466013 systemd[1]: Started Virtual Machine qemu-32-instance-0000004a.
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.448 2 DEBUG nova.compute.manager [req-8d459c91-83f2-4c71-9302-346139a98fc7 req-d211c354-f6ed-4342-9904-43486ca8f610 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.449 2 DEBUG oslo_concurrency.lockutils [req-8d459c91-83f2-4c71-9302-346139a98fc7 req-d211c354-f6ed-4342-9904-43486ca8f610 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.450 2 DEBUG oslo_concurrency.lockutils [req-8d459c91-83f2-4c71-9302-346139a98fc7 req-d211c354-f6ed-4342-9904-43486ca8f610 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.450 2 DEBUG oslo_concurrency.lockutils [req-8d459c91-83f2-4c71-9302-346139a98fc7 req-d211c354-f6ed-4342-9904-43486ca8f610 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.450 2 DEBUG nova.compute.manager [req-8d459c91-83f2-4c71-9302-346139a98fc7 req-d211c354-f6ed-4342-9904-43486ca8f610 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.450 2 WARNING nova.compute.manager [req-8d459c91-83f2-4c71-9302-346139a98fc7 req-d211c354-f6ed-4342-9904-43486ca8f610 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.746 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 70d78115-9cfc-487c-95b6-f4f4149c52a1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.746 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407309.7454193, 70d78115-9cfc-487c-95b6-f4f4149c52a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.747 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.767 2 DEBUG nova.compute.manager [None req-1d897a85-f6a5-4fc2-aeca-2ac820f883b8 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.777 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.782 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.814 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.814 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407309.7466137, 70d78115-9cfc-487c-95b6-f4f4149c52a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.815 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.855 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:09 np0005466013 nova_compute[192144]: 2025-10-02 12:15:09.861 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:10 np0005466013 podman[230837]: 2025-10-02 12:15:10.728041277 +0000 UTC m=+0.085317021 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:15:10 np0005466013 nova_compute[192144]: 2025-10-02 12:15:10.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:10 np0005466013 podman[230836]: 2025-10-02 12:15:10.749348994 +0000 UTC m=+0.106546706 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:15:10 np0005466013 podman[230838]: 2025-10-02 12:15:10.774965404 +0000 UTC m=+0.123374784 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.597 2 DEBUG nova.compute.manager [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.597 2 DEBUG oslo_concurrency.lockutils [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.598 2 DEBUG oslo_concurrency.lockutils [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.598 2 DEBUG oslo_concurrency.lockutils [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.598 2 DEBUG nova.compute.manager [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.598 2 WARNING nova.compute.manager [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.598 2 DEBUG nova.compute.manager [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.599 2 DEBUG oslo_concurrency.lockutils [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.599 2 DEBUG oslo_concurrency.lockutils [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.599 2 DEBUG oslo_concurrency.lockutils [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.599 2 DEBUG nova.compute.manager [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:11 np0005466013 nova_compute[192144]: 2025-10-02 12:15:11.599 2 WARNING nova.compute.manager [req-20454ff2-8c9f-436c-952b-d34324f39925 req-e86d7075-9b10-4a93-bfb6-a93de91c0fbf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:15:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:11.822 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:13 np0005466013 nova_compute[192144]: 2025-10-02 12:15:13.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:14 np0005466013 nova_compute[192144]: 2025-10-02 12:15:14.799 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:14 np0005466013 nova_compute[192144]: 2025-10-02 12:15:14.799 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:14 np0005466013 nova_compute[192144]: 2025-10-02 12:15:14.824 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:15:14 np0005466013 nova_compute[192144]: 2025-10-02 12:15:14.997 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:14 np0005466013 nova_compute[192144]: 2025-10-02 12:15:14.998 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.006 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.006 2 INFO nova.compute.claims [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.203 2 DEBUG nova.scheduler.client.report [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.279 2 DEBUG nova.scheduler.client.report [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.281 2 DEBUG nova.compute.provider_tree [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.297 2 DEBUG nova.scheduler.client.report [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.339 2 DEBUG nova.scheduler.client.report [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.681 2 DEBUG nova.compute.provider_tree [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.708 2 DEBUG nova.scheduler.client.report [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.742 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.742 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.833 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.834 2 DEBUG nova.network.neutron [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.855 2 INFO nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:15:15 np0005466013 nova_compute[192144]: 2025-10-02 12:15:15.878 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.033 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.037 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.038 2 INFO nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Creating image(s)#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.039 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.039 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.040 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.061 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.097 2 DEBUG nova.policy [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.131 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.133 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.134 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.158 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.225 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.227 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.273 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.274 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.275 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.348 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.349 2 DEBUG nova.virt.disk.api [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Checking if we can resize image /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.350 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.349 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'name': 'tempest-ServerRescueTestJSON-server-899246229', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004a', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'da589921190a470cab62d12688f03735', 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'hostId': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.354 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 70d78115-9cfc-487c-95b6-f4f4149c52a1 / tapcea86da2-59 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.355 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da51807f-eda3-4e4c-a8d2-f9ba490a1d2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.351368', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74922b30-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': 'ea4c71907a85b0935d456fb4dfb27e9553c6fce737da4b69aa599d7754ba53e9'}]}, 'timestamp': '2025-10-02 12:15:16.356040', '_unique_id': 'c13857ce8b5c42549ab10c65e72ab0c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.358 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.394 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.395 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.395 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a88a12e-6421-47c4-a312-b7b362665b8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.359712', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74982a62-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '3de9cb9a12d0f5ab5f1027fd5998222c8aa1d876014f69a33342057a77aeeaa2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.359712', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74983ac0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '6e87d21ea7e35cf04d8532416d20ecb38cd6eb877d5b4340639d60d8665a3d0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.359712', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74984682-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '0ab6b78a912526eeb1155c0be3f2b8dfd5ec1cc1b236483d9beeba414a822b51'}]}, 'timestamp': '2025-10-02 12:15:16.395952', '_unique_id': '144f900c741747238ee0802b67682480'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.397 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.398 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.415 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.417 2 DEBUG nova.virt.disk.api [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Cannot resize image /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.417 2 DEBUG nova.objects.instance [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'migration_context' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.418 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.419 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.419 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6358016-b6e1-4079-84e1-631845e2c87e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.398919', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '749bde3c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': '02babfef11e1e1d105e3c9a9eb413893a16e81b9aef5b56b6a4c53d3ae4df340'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 
'70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.398919', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '749bee0e-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': '33b3113cbd8d86d3998d4e8db8613acf7d2902ae6be9885dc5c6bbd0eef230ea'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.398919', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '749bfa52-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': 'd8d66bae47c6f085a1aa1bbeb565576912662c34d39984d48ee4ee054cd3ed4b'}]}, 'timestamp': '2025-10-02 12:15:16.420191', '_unique_id': '612658796e6b4c468a8ca4d10f9f87e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.422 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.422 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>]
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.423 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.423 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.423 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>]
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.423 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.423 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1288dcc0-f17d-4cbf-8572-64e90eef7b8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.423764', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '749c9462-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '3d6260d70f5982942812ae84f59742c81e20622f78b5b940e134265cd467d4ba'}]}, 'timestamp': '2025-10-02 12:15:16.424168', '_unique_id': 'bc85c561e1fb4a428ffbb13951a9fe74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.426 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.426 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '546e49e3-fe29-4bd2-a560-645e1ceac292', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.426636', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '749d0460-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '8b9a58f44a2429c054ff673da1b2c64a89deb9a917ad76f192c2c576c392d2b9'}]}, 'timestamp': '2025-10-02 12:15:16.427032', '_unique_id': '65a413cd835f467b85953b36e1526e8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.428 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.429 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.429 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>]
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.429 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.433 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.434 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Ensure instance console log exists: /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.434 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.435 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.435 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.446 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.447 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 70d78115-9cfc-487c-95b6-f4f4149c52a1: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.447 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.448 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.448 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e595456a-632f-43a1-9ba5-74ab8102139b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.447578', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74a03946-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '9bbd1b6160ac7cbb2c639a52534069786dcd7f0e36034696b1aae6f047a8f1c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.447578', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74a04918-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '047d92ec790825e31e6810c50e57aee4af8fd3c2575792082d94d6b2e127ba18'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.447578', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74a0552a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '918f6c5260011295b53d54d50353ecc04c4bb089c52dd2ce0a46b8e9dc8f08bb'}]}, 'timestamp': '2025-10-02 12:15:16.448740', '_unique_id': '4e9595eb9f6d4a0e9932f65976d42b4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.451 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.451 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/cpu volume: 6460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f1e2a32-2ea6-43ca-9f6b-29afef1de25b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6460000000, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'timestamp': '2025-10-02T12:15:16.451544', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '74a0d194-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.124821979, 'message_signature': '020356023a932a72f559db234d012331262f546b5ff6cf5d66d63af748d205e0'}]}, 'timestamp': '2025-10-02 12:15:16.451983', '_unique_id': 'a395c9f674a749ba8999a7452cc445ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.454 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.454 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e76842a-9531-4c1a-8846-82e65d0c49a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.454487', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74a144bc-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': 'bb3424f85129604e15e8a9505532eea01f1f838fb48760c6c4783d4f077208d2'}]}, 'timestamp': '2025-10-02 12:15:16.454936', '_unique_id': '9f291178e5e7445f84a40f25a98879b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.457 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.457 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.458 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e033864-3c19-4d6c-bd32-46ff92560dee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.457444', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74a1b8b6-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '9e3fded654d47cb1fc32e4d2a3388f5e21f4c0c969dd90d69fe5d6df46702a2e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.457444', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74a1c7c0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': 'e70318a50d739bf87b99467acab5697a44dd8f973388449853aa2cc421383e36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.457444', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74a1d3a0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': 'ffb5ae59e5b3bdcebcdd524ad5acc3d0ba0795e34ebc3fdb5db5c2b7d2c6a588'}]}, 'timestamp': '2025-10-02 12:15:16.458522', '_unique_id': '7b09ed67c6904c3f86a8028b99b75a90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.461 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.461 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '117baf4d-8d59-4864-a204-c8fd77318789', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.461203', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74a24ba0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '7e04eaf21a74549b46fcfba81650e5feeba8ec2ed2fd03ed6d399f87b8a955aa'}]}, 'timestamp': '2025-10-02 12:15:16.461676', '_unique_id': '473d921c7d914e798e6e70389365c41f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.463 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.464 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b65bdf5c-b8a5-4055-b82d-1b76573794b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.464110', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74a2bc52-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '94f2be5767c78d98eb56b21821cbf013dd559a306aa87c738e34f548da9fbb44'}]}, 'timestamp': '2025-10-02 12:15:16.464522', '_unique_id': '000f1a6f70ef4801a5a252ff5e6bfb92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.466 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a079b9fb-a21f-428f-9609-babbc7323065', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.466867', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74a32908-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '63267f886e2f422d883ae0ea5b14ad8ec3648ebc00bc7a5314aa549aab0cc0e8'}]}, 'timestamp': '2025-10-02 12:15:16.467314', '_unique_id': '1f37be483a1e48c49bcd35acc6e602a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.469 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.470 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.470 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddb77e4d-3432-4efb-ae89-1af1d5bd31c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.469771', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74a39a0a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': '98606963ca09a518761ed545375cf2d129370db9f9136eb90ef98eb07089bc99'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.469771', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74a3a7c0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': '40a94fbb427abaa7856016e7825d3ccd2ce0e23f3ab58c2d91d219023bd608e8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.469771', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74a3b38c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': 'fe7c279e1d2b48a172fd722ebc0325dd2f29f94ccf4a97c90a9f65b59ea68c11'}]}, 'timestamp': '2025-10-02 12:15:16.470806', '_unique_id': '80834635bd3b4424a15ed00b8526f285'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.473 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.473 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.473 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-899246229>]
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.473 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.473 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.474 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.474 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cddb6306-669a-423d-b667-bf2e07e90f52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.473944', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74a43bf4-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '1b30d3e56a42cee64d8a89c505ee01f0288bf75940547445bef342e61c29ee61'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': 
None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.473944', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74a4490a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': 'd0027d2824d9249453cf27949de0ee0167204c05219a3e4b13321cd569d5c25c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.473944', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74a45422-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': 'fe64ad5e8c6620449d6d1071cbfb9abd2458b9751445c4c48d3863fb5d27cde6'}]}, 'timestamp': '2025-10-02 12:15:16.474928', '_unique_id': '8114aa1f91d447b0beb99f0be32ce854'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.477 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.477 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b20f351-b639-4c35-88d0-e6fda7d1eef7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.477340', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74a4c182-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '2a1e97c62826d29990b0773e3b4e56edb173eefda81d07661d225e674a4ee3c5'}]}, 'timestamp': '2025-10-02 12:15:16.477785', '_unique_id': '53a56ecb6bb642f9b66f2ad40f2af733'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.479 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.480 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.480 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.480 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a021f1a1-b909-49b8-bf71-3c0fe9dfc1fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.480092', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74a52c4e-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': 'cf032fef65d2211388c4c6759f300c01e4e66b8b37a5359fb78a3d6a311ee859'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.480092', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74a5395a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': '11df11032c80657fbda1bd0be39fe0e526fea3b56769e8c11633dd89050de324'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.480092', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74a54530-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.077179229, 'message_signature': '03e1955ae56d44c6c74a883a0d154640473597d586430eec10aba5bde32700e8'}]}, 'timestamp': '2025-10-02 12:15:16.481079', '_unique_id': '63d55a543aed4b6dae6ed3c07ed3f21b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.483 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.483 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7af9c427-7ac3-41dc-8d9d-4d6b54bf14b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.483480', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74a5b04c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '0b50ae4be33d07e9e48c0e50d54c2a996855468d5edc036684f322253621ad16'}]}, 'timestamp': '2025-10-02 12:15:16.483880', '_unique_id': '33e3d0f349214c9aac32e1fb8219d7d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.485 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.486 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.latency volume: 468800483 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.486 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.486 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.read.latency volume: 4258241 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '791369f9-2079-443f-bcc8-b2dbbca8be68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 468800483, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.486077', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74a61636-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '3f34c43f939a1e697184272b118e2d55db3b9706d03a9ae18fb0c5d3750f7614'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 
'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.486077', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74a623b0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '0f701eea364231722d87ec66cad883d2407a1b49543b9d0508ccddf592d22437'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4258241, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.486077', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74a6308a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '138604c97329e2dbef7c473b2eb063c7366ca0ca0261ee85625e4fc4f0852292'}]}, 'timestamp': '2025-10-02 12:15:16.487118', '_unique_id': '7adfe6dda9ab4ac08415a24f0b091668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.489 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.489 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.490 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.490 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ad7b441-8ece-4a0c-8699-d4f098f09f69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-vda', 'timestamp': '2025-10-02T12:15:16.489726', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '74a6a664-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '0317b8c1013a8f96598b84c3d61c550f6745d158a766df761c8fa4ddb168f703'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 
'70d78115-9cfc-487c-95b6-f4f4149c52a1-vdb', 'timestamp': '2025-10-02T12:15:16.489726', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '74a6b406-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': '693d337c1c2009ac92ecb48153a81d043cf1795471228dc878151ce4844bfdaf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1-sda', 'timestamp': '2025-10-02T12:15:16.489726', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'instance-0000004a', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '74a6bfd2-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.037981108, 'message_signature': 'e64b3d1649f9c4cfe1272d8b3fb9ab9fcf96fd9295d809ad4bfff68f1417ce57'}]}, 'timestamp': '2025-10-02 12:15:16.490785', '_unique_id': '9a13de3406b2417c9ddb89aca82bc942'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.493 12 DEBUG ceilometer.compute.pollsters [-] 70d78115-9cfc-487c-95b6-f4f4149c52a1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0629059-12e4-444e-9b3a-f653df6fcb84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7b25dedc41b548469d2b0627e3255b9e', 'user_name': None, 'project_id': 'da589921190a470cab62d12688f03735', 'project_name': None, 'resource_id': 'instance-0000004a-70d78115-9cfc-487c-95b6-f4f4149c52a1-tapcea86da2-59', 'timestamp': '2025-10-02T12:15:16.493436', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-899246229', 'name': 'tapcea86da2-59', 'instance_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'instance_type': 'm1.nano', 'host': '439393a507f64564670b57506323d7fef63882363726b30c3acc778c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:84:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcea86da2-59'}, 'message_id': '74a73782-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5308.02958417, 'message_signature': '9fbc3ce74219eb38bd85018304db6e6d843f2ac1eca98220b9e2003b0210d3bb'}]}, 'timestamp': '2025-10-02 12:15:16.493917', '_unique_id': '1b1796f2d47343f28f16d3247e315673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:15:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:15:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:15:16 np0005466013 nova_compute[192144]: 2025-10-02 12:15:16.994 2 DEBUG nova.network.neutron [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Successfully created port: f7ea9721-35c3-47ed-a0ac-d43479729826 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.570 2 DEBUG nova.network.neutron [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Successfully updated port: f7ea9721-35c3-47ed-a0ac-d43479729826 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.596 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.596 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquired lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.597 2 DEBUG nova.network.neutron [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.720 2 DEBUG nova.compute.manager [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-changed-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.721 2 DEBUG nova.compute.manager [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Refreshing instance network info cache due to event network-changed-f7ea9721-35c3-47ed-a0ac-d43479729826. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:15:18 np0005466013 nova_compute[192144]: 2025-10-02 12:15:18.721 2 DEBUG oslo_concurrency.lockutils [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:19 np0005466013 nova_compute[192144]: 2025-10-02 12:15:19.027 2 DEBUG nova.network.neutron [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.011 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.012 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.176 2 DEBUG nova.network.neutron [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Updating instance_info_cache with network_info: [{"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.197 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Releasing lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.198 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance network_info: |[{"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.199 2 DEBUG oslo_concurrency.lockutils [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.199 2 DEBUG nova.network.neutron [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Refreshing network info cache for port f7ea9721-35c3-47ed-a0ac-d43479729826 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.202 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Start _get_guest_xml network_info=[{"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.208 2 WARNING nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.216 2 DEBUG nova.virt.libvirt.host [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.217 2 DEBUG nova.virt.libvirt.host [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.220 2 DEBUG nova.virt.libvirt.host [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.221 2 DEBUG nova.virt.libvirt.host [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.222 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.223 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.223 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.223 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.224 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.224 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.224 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.225 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.225 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.225 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.225 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.226 2 DEBUG nova.virt.hardware [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.229 2 DEBUG nova.virt.libvirt.vif [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-361250710',display_name='tempest-ServerRescueTestJSON-server-361250710',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-361250710',id=77,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-m05rmlnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-1453830403
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:15Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=54738e99-a5ed-4771-810b-8bea70fde21a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.230 2 DEBUG nova.network.os_vif_util [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.231 2 DEBUG nova.network.os_vif_util [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.232 2 DEBUG nova.objects.instance [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'pci_devices' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.248 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <uuid>54738e99-a5ed-4771-810b-8bea70fde21a</uuid>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <name>instance-0000004d</name>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerRescueTestJSON-server-361250710</nova:name>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:15:20</nova:creationTime>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:user uuid="7b25dedc41b548469d2b0627e3255b9e">tempest-ServerRescueTestJSON-1453830403-project-member</nova:user>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:project uuid="da589921190a470cab62d12688f03735">tempest-ServerRescueTestJSON-1453830403</nova:project>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        <nova:port uuid="f7ea9721-35c3-47ed-a0ac-d43479729826">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <entry name="serial">54738e99-a5ed-4771-810b-8bea70fde21a</entry>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <entry name="uuid">54738e99-a5ed-4771-810b-8bea70fde21a</entry>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:f2:d3:3b"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <target dev="tapf7ea9721-35"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/console.log" append="off"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:15:20 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:15:20 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:15:20 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:15:20 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.252 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Preparing to wait for external event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.253 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.253 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.254 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.254 2 DEBUG nova.virt.libvirt.vif [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-361250710',display_name='tempest-ServerRescueTestJSON-server-361250710',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-361250710',id=77,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-m05rmlnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-
1453830403-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:15Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=54738e99-a5ed-4771-810b-8bea70fde21a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.255 2 DEBUG nova.network.os_vif_util [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.256 2 DEBUG nova.network.os_vif_util [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.256 2 DEBUG os_vif [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ea9721-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.265 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7ea9721-35, col_values=(('external_ids', {'iface-id': 'f7ea9721-35c3-47ed-a0ac-d43479729826', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:d3:3b', 'vm-uuid': '54738e99-a5ed-4771-810b-8bea70fde21a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:20 np0005466013 NetworkManager[51205]: <info>  [1759407320.2676] manager: (tapf7ea9721-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.275 2 INFO os_vif [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35')#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.339 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.339 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.339 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No VIF found with MAC fa:16:3e:f2:d3:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:20 np0005466013 nova_compute[192144]: 2025-10-02 12:15:20.340 2 INFO nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Using config drive#033[00m
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.008 2 INFO nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Creating config drive at /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config#033[00m
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.014 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3akcgjmy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.141 2 DEBUG oslo_concurrency.processutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3akcgjmy" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:21 np0005466013 kernel: tapf7ea9721-35: entered promiscuous mode
Oct  2 08:15:21 np0005466013 NetworkManager[51205]: <info>  [1759407321.2176] manager: (tapf7ea9721-35): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Oct  2 08:15:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:21Z|00266|binding|INFO|Claiming lport f7ea9721-35c3-47ed-a0ac-d43479729826 for this chassis.
Oct  2 08:15:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:21Z|00267|binding|INFO|f7ea9721-35c3-47ed-a0ac-d43479729826: Claiming fa:16:3e:f2:d3:3b 10.100.0.13
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:21Z|00268|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 ovn-installed in OVS
Oct  2 08:15:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:21Z|00269|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 up in Southbound
Oct  2 08:15:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:21.236 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:d3:3b 10.100.0.13'], port_security=['fa:16:3e:f2:d3:3b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '54738e99-a5ed-4771-810b-8bea70fde21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=f7ea9721-35c3-47ed-a0ac-d43479729826) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:21.240 103323 INFO neutron.agent.ovn.metadata.agent [-] Port f7ea9721-35c3-47ed-a0ac-d43479729826 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b bound to our chassis#033[00m
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:21.242 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:15:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:21.244 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[13a0896f-6101-4a37-8155-79360fd6abe6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:21 np0005466013 systemd-udevd[230963]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:21 np0005466013 systemd-machined[152202]: New machine qemu-33-instance-0000004d.
Oct  2 08:15:21 np0005466013 NetworkManager[51205]: <info>  [1759407321.2757] device (tapf7ea9721-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:15:21 np0005466013 NetworkManager[51205]: <info>  [1759407321.2765] device (tapf7ea9721-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:15:21 np0005466013 systemd[1]: Started Virtual Machine qemu-33-instance-0000004d.
Oct  2 08:15:21 np0005466013 podman[230943]: 2025-10-02 12:15:21.314147174 +0000 UTC m=+0.101415865 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:21 np0005466013 nova_compute[192144]: 2025-10-02 12:15:21.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.016 2 DEBUG nova.network.neutron [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Updated VIF entry in instance network info cache for port f7ea9721-35c3-47ed-a0ac-d43479729826. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.016 2 DEBUG nova.network.neutron [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Updating instance_info_cache with network_info: [{"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.020 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.033 2 DEBUG oslo_concurrency.lockutils [req-a992019b-1fa0-4766-89cd-0fa5d9669d12 req-022e2e9c-b8f0-4b00-b919-c07c8c349467 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.126 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.194 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.195 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.266 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
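Each `qemu-img info` probe above is wrapped in `oslo_concurrency.prlimit`, which caps the child's address space at 1 GiB and CPU time at 30 s so a malformed image cannot exhaust the host during inspection, and `--force-share` lets the probe read a disk the running VM still holds a lock on. A hedged sketch of how such an argv is composed (limits and flags copied from the log; the helper name is illustrative):

```python
# Sketch: build the prlimit-guarded qemu-img info argv seen in the log.
# The function name is hypothetical; nova builds this inside its image backend.
def prlimited_qemu_img_info(disk_path, address_space=1073741824, cpu_secs=30):
    """Return an argv matching the logged command: prlimit caps the
    address space (bytes) and CPU seconds of the qemu-img child."""
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={address_space}", f"--cpu={cpu_secs}", "--",
        "env", "LC_ALL=C", "LANG=C",       # stable, parseable locale
        "qemu-img", "info", disk_path,
        "--force-share",                   # read despite the VM's image lock
        "--output=json",
    ]

argv = prlimited_qemu_img_info(
    "/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk")
```

Parsing the resulting JSON (`virtual-size`, `actual-size`, `format`) is what feeds the resource tracker's disk accounting later in this log.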
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.269 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407322.2692072, 54738e99-a5ed-4771-810b-8bea70fde21a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.269 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.275 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.297 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.301 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407322.269269, 54738e99-a5ed-4771-810b-8bea70fde21a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.302 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.331 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.334 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.rescue --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.335 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.357 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.382 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.399 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk.rescue --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.400 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.464 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.465 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.526 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.707 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.709 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5556MB free_disk=73.3222885131836GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.709 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.709 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.887 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 70d78115-9cfc-487c-95b6-f4f4149c52a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.888 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 54738e99-a5ed-4771-810b-8bea70fde21a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.888 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.888 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:15:22 np0005466013 nova_compute[192144]: 2025-10-02 12:15:22.982 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:23 np0005466013 nova_compute[192144]: 2025-10-02 12:15:23.023 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
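The inventory dict in the entry above feeds placement's standard capacity formula, capacity = (total - reserved) * allocation_ratio. A quick check against the logged numbers (the function name is illustrative; the formula is placement's documented usage semantics):

```python
# Sketch: effective schedulable capacity per resource class, computed
# from the inventory values logged by the scheduler report client above.
def capacity(inv):
    # Placement counts (total - reserved) scaled by the allocation ratio.
    return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

inventory = {
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
}
caps = {rc: capacity(inv) for rc, inv in inventory.items()}
# 8 physical vCPUs overcommitted 4x yield 32 schedulable VCPU units,
# which is consistent with "Total usable vcpus: 8, total allocated vcpus: 2".
```

With only 2 of 32 VCPU units allocated, inventory is unchanged and the report client skips the PUT to placement, as the log states.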
Oct  2 08:15:23 np0005466013 nova_compute[192144]: 2025-10-02 12:15:23.085 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:15:23 np0005466013 nova_compute[192144]: 2025-10-02 12:15:23.086 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:23 np0005466013 nova_compute[192144]: 2025-10-02 12:15:23.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:23 np0005466013 podman[231005]: 2025-10-02 12:15:23.695079249 +0000 UTC m=+0.067026805 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Oct  2 08:15:23 np0005466013 podman[231004]: 2025-10-02 12:15:23.69511344 +0000 UTC m=+0.067314394 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.085 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.086 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.086 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.123 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.366 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.367 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.367 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:15:24 np0005466013 nova_compute[192144]: 2025-10-02 12:15:24.367 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:25 np0005466013 nova_compute[192144]: 2025-10-02 12:15:25.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:26 np0005466013 nova_compute[192144]: 2025-10-02 12:15:26.737 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Updating instance_info_cache with network_info: [{"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:26 np0005466013 nova_compute[192144]: 2025-10-02 12:15:26.764 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-70d78115-9cfc-487c-95b6-f4f4149c52a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:26 np0005466013 nova_compute[192144]: 2025-10-02 12:15:26.765 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:15:26 np0005466013 nova_compute[192144]: 2025-10-02 12:15:26.765 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:26 np0005466013 nova_compute[192144]: 2025-10-02 12:15:26.766 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:26 np0005466013 nova_compute[192144]: 2025-10-02 12:15:26.766 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:26 np0005466013 nova_compute[192144]: 2025-10-02 12:15:26.766 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:28 np0005466013 nova_compute[192144]: 2025-10-02 12:15:28.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:28 np0005466013 podman[231041]: 2025-10-02 12:15:28.695733692 +0000 UTC m=+0.062973590 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:15:28 np0005466013 podman[231042]: 2025-10-02 12:15:28.704163881 +0000 UTC m=+0.070042164 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.135 2 DEBUG nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.135 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.136 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.136 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.136 2 DEBUG nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Processing event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.136 2 DEBUG nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.137 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.137 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.137 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.137 2 DEBUG nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.137 2 WARNING nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.138 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.141 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407329.1416287, 54738e99-a5ed-4771-810b-8bea70fde21a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.142 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.144 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.148 2 INFO nova.virt.libvirt.driver [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance spawned successfully.#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.148 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.229 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.235 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.237 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.238 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.238 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.239 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.239 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.239 2 DEBUG nova.virt.libvirt.driver [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.339 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.652 2 INFO nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Took 13.62 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.653 2 DEBUG nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:29 np0005466013 nova_compute[192144]: 2025-10-02 12:15:29.962 2 INFO nova.compute.manager [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Took 15.06 seconds to build instance.#033[00m
Oct  2 08:15:30 np0005466013 nova_compute[192144]: 2025-10-02 12:15:30.014 2 DEBUG oslo_concurrency.lockutils [None req-d232e40c-934b-4255-aaa5-a4548d2e78cc 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:30 np0005466013 nova_compute[192144]: 2025-10-02 12:15:30.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:32 np0005466013 nova_compute[192144]: 2025-10-02 12:15:32.673 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:32 np0005466013 nova_compute[192144]: 2025-10-02 12:15:32.994 2 INFO nova.compute.manager [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Rescuing#033[00m
Oct  2 08:15:32 np0005466013 nova_compute[192144]: 2025-10-02 12:15:32.994 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:32 np0005466013 nova_compute[192144]: 2025-10-02 12:15:32.995 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquired lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:32 np0005466013 nova_compute[192144]: 2025-10-02 12:15:32.995 2 DEBUG nova.network.neutron [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:33 np0005466013 nova_compute[192144]: 2025-10-02 12:15:33.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:34 np0005466013 nova_compute[192144]: 2025-10-02 12:15:34.880 2 DEBUG nova.network.neutron [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Updating instance_info_cache with network_info: [{"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:34 np0005466013 nova_compute[192144]: 2025-10-02 12:15:34.919 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Releasing lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:35 np0005466013 nova_compute[192144]: 2025-10-02 12:15:35.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:35 np0005466013 nova_compute[192144]: 2025-10-02 12:15:35.350 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:15:38 np0005466013 nova_compute[192144]: 2025-10-02 12:15:38.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:40 np0005466013 nova_compute[192144]: 2025-10-02 12:15:40.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:41 np0005466013 podman[231098]: 2025-10-02 12:15:41.696639731 +0000 UTC m=+0.062342440 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:15:41 np0005466013 podman[231099]: 2025-10-02 12:15:41.703099865 +0000 UTC m=+0.062136673 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:15:41 np0005466013 podman[231100]: 2025-10-02 12:15:41.739068388 +0000 UTC m=+0.098109836 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:15:43 np0005466013 nova_compute[192144]: 2025-10-02 12:15:43.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:45 np0005466013 nova_compute[192144]: 2025-10-02 12:15:45.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:45 np0005466013 nova_compute[192144]: 2025-10-02 12:15:45.400 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:15:47 np0005466013 kernel: tapf7ea9721-35 (unregistering): left promiscuous mode
Oct  2 08:15:47 np0005466013 NetworkManager[51205]: <info>  [1759407347.6226] device (tapf7ea9721-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:15:47 np0005466013 nova_compute[192144]: 2025-10-02 12:15:47.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:47Z|00270|binding|INFO|Releasing lport f7ea9721-35c3-47ed-a0ac-d43479729826 from this chassis (sb_readonly=0)
Oct  2 08:15:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:47Z|00271|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 down in Southbound
Oct  2 08:15:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:47Z|00272|binding|INFO|Removing iface tapf7ea9721-35 ovn-installed in OVS
Oct  2 08:15:47 np0005466013 nova_compute[192144]: 2025-10-02 12:15:47.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:47 np0005466013 nova_compute[192144]: 2025-10-02 12:15:47.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:47 np0005466013 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Oct  2 08:15:47 np0005466013 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004d.scope: Consumed 13.250s CPU time.
Oct  2 08:15:47 np0005466013 systemd-machined[152202]: Machine qemu-33-instance-0000004d terminated.
Oct  2 08:15:48 np0005466013 nova_compute[192144]: 2025-10-02 12:15:48.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:48 np0005466013 nova_compute[192144]: 2025-10-02 12:15:48.413 2 INFO nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:15:48 np0005466013 nova_compute[192144]: 2025-10-02 12:15:48.419 2 INFO nova.virt.libvirt.driver [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance destroyed successfully.#033[00m
Oct  2 08:15:48 np0005466013 nova_compute[192144]: 2025-10-02 12:15:48.419 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'numa_topology' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:49.779 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:d3:3b 10.100.0.13'], port_security=['fa:16:3e:f2:d3:3b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '54738e99-a5ed-4771-810b-8bea70fde21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=f7ea9721-35c3-47ed-a0ac-d43479729826) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:49.780 103323 INFO neutron.agent.ovn.metadata.agent [-] Port f7ea9721-35c3-47ed-a0ac-d43479729826 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b unbound from our chassis#033[00m
Oct  2 08:15:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:49.782 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:15:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:49.783 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbef9ef-0fda-46bb-95a4-b4336b095ab9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466013 nova_compute[192144]: 2025-10-02 12:15:50.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.598 2 INFO nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Attempting rescue#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.599 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.603 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.603 2 INFO nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Creating image(s)#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.605 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.605 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.606 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.606 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.667 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.669 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.680 2 DEBUG oslo_concurrency.processutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:51 np0005466013 podman[231194]: 2025-10-02 12:15:51.70673555 +0000 UTC m=+0.072804577 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:15:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:51.751 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:51.752 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.755 2 DEBUG oslo_concurrency.processutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.756 2 DEBUG oslo_concurrency.processutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.797 2 DEBUG oslo_concurrency.processutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.rescue" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.798 2 DEBUG oslo_concurrency.lockutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.799 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'migration_context' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.853 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.855 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Start _get_guest_xml network_info=[{"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2032524031-network", "vif_mac": "fa:16:3e:f2:d3:3b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.855 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'resources' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.895 2 WARNING nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.909 2 DEBUG nova.virt.libvirt.host [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.910 2 DEBUG nova.virt.libvirt.host [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.915 2 DEBUG nova.virt.libvirt.host [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.916 2 DEBUG nova.virt.libvirt.host [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.918 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.918 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.919 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.919 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.919 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.919 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.919 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.920 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.920 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.920 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.920 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.921 2 DEBUG nova.virt.hardware [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:15:51 np0005466013 nova_compute[192144]: 2025-10-02 12:15:51.921 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.007 2 DEBUG nova.virt.libvirt.vif [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:15:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-361250710',display_name='tempest-ServerRescueTestJSON-server-361250710',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-361250710',id=77,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-m05rmlnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-1453830403-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:29Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=54738e99-a5ed-4771-810b-8bea70fde21a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2032524031-network", "vif_mac": "fa:16:3e:f2:d3:3b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.008 2 DEBUG nova.network.os_vif_util [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2032524031-network", "vif_mac": "fa:16:3e:f2:d3:3b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.009 2 DEBUG nova.network.os_vif_util [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.010 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'pci_devices' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.145 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <uuid>54738e99-a5ed-4771-810b-8bea70fde21a</uuid>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <name>instance-0000004d</name>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerRescueTestJSON-server-361250710</nova:name>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:15:51</nova:creationTime>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:user uuid="7b25dedc41b548469d2b0627e3255b9e">tempest-ServerRescueTestJSON-1453830403-project-member</nova:user>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:project uuid="da589921190a470cab62d12688f03735">tempest-ServerRescueTestJSON-1453830403</nova:project>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        <nova:port uuid="f7ea9721-35c3-47ed-a0ac-d43479729826">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <entry name="serial">54738e99-a5ed-4771-810b-8bea70fde21a</entry>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <entry name="uuid">54738e99-a5ed-4771-810b-8bea70fde21a</entry>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.rescue"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config.rescue"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:f2:d3:3b"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <target dev="tapf7ea9721-35"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/console.log" append="off"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:15:52 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:15:52 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:15:52 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:15:52 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.155 2 INFO nova.virt.libvirt.driver [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance destroyed successfully.#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.450 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.451 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.451 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.452 2 DEBUG nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] No VIF found with MAC fa:16:3e:f2:d3:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.452 2 INFO nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Using config drive#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.642 2 DEBUG nova.compute.manager [req-066e928d-cbf4-40cf-84c0-090ebb55a540 req-4e31f4a4-e704-4d41-b330-972ffaa042b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.642 2 DEBUG oslo_concurrency.lockutils [req-066e928d-cbf4-40cf-84c0-090ebb55a540 req-4e31f4a4-e704-4d41-b330-972ffaa042b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.643 2 DEBUG oslo_concurrency.lockutils [req-066e928d-cbf4-40cf-84c0-090ebb55a540 req-4e31f4a4-e704-4d41-b330-972ffaa042b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.643 2 DEBUG oslo_concurrency.lockutils [req-066e928d-cbf4-40cf-84c0-090ebb55a540 req-4e31f4a4-e704-4d41-b330-972ffaa042b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.643 2 DEBUG nova.compute.manager [req-066e928d-cbf4-40cf-84c0-090ebb55a540 req-4e31f4a4-e704-4d41-b330-972ffaa042b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.643 2 WARNING nova.compute.manager [req-066e928d-cbf4-40cf-84c0-090ebb55a540 req-4e31f4a4-e704-4d41-b330-972ffaa042b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.650 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:52 np0005466013 nova_compute[192144]: 2025-10-02 12:15:52.777 2 DEBUG nova.objects.instance [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'keypairs' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:53 np0005466013 nova_compute[192144]: 2025-10-02 12:15:53.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.151 2 INFO nova.virt.libvirt.driver [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Creating config drive at /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config.rescue#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.157 2 DEBUG oslo_concurrency.processutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8je1voo7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.290 2 DEBUG oslo_concurrency.processutils [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8je1voo7" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:54 np0005466013 kernel: tapf7ea9721-35: entered promiscuous mode
Oct  2 08:15:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:54Z|00273|binding|INFO|Claiming lport f7ea9721-35c3-47ed-a0ac-d43479729826 for this chassis.
Oct  2 08:15:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:54Z|00274|binding|INFO|f7ea9721-35c3-47ed-a0ac-d43479729826: Claiming fa:16:3e:f2:d3:3b 10.100.0.13
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:54 np0005466013 NetworkManager[51205]: <info>  [1759407354.3963] manager: (tapf7ea9721-35): new Tun device (/org/freedesktop/NetworkManager/Devices/137)
Oct  2 08:15:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:54Z|00275|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 ovn-installed in OVS
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:15:54Z|00276|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 up in Southbound
Oct  2 08:15:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:54.419 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:d3:3b 10.100.0.13'], port_security=['fa:16:3e:f2:d3:3b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '54738e99-a5ed-4771-810b-8bea70fde21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=f7ea9721-35c3-47ed-a0ac-d43479729826) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:54.420 103323 INFO neutron.agent.ovn.metadata.agent [-] Port f7ea9721-35c3-47ed-a0ac-d43479729826 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b bound to our chassis#033[00m
Oct  2 08:15:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:54.421 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:15:54.422 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5ba537-603b-4067-9381-3a781e5c4d4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:54 np0005466013 systemd-udevd[231272]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:54 np0005466013 systemd-machined[152202]: New machine qemu-34-instance-0000004d.
Oct  2 08:15:54 np0005466013 NetworkManager[51205]: <info>  [1759407354.4452] device (tapf7ea9721-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:15:54 np0005466013 NetworkManager[51205]: <info>  [1759407354.4463] device (tapf7ea9721-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:15:54 np0005466013 systemd[1]: Started Virtual Machine qemu-34-instance-0000004d.
Oct  2 08:15:54 np0005466013 podman[231232]: 2025-10-02 12:15:54.455530759 +0000 UTC m=+0.086246141 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc.)
Oct  2 08:15:54 np0005466013 podman[231231]: 2025-10-02 12:15:54.460311438 +0000 UTC m=+0.088037402 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.884 2 DEBUG nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.887 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.887 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.887 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.888 2 DEBUG nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:54 np0005466013 nova_compute[192144]: 2025-10-02 12:15:54.888 2 WARNING nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.419 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 54738e99-a5ed-4771-810b-8bea70fde21a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.420 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407355.418403, 54738e99-a5ed-4771-810b-8bea70fde21a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.420 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.438 2 DEBUG nova.compute.manager [None req-aa35f98a-649e-473a-9bff-7ae7a3fbafee 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.486 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.493 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.533 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.534 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407355.4200006, 54738e99-a5ed-4771-810b-8bea70fde21a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.534 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.566 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:55 np0005466013 nova_compute[192144]: 2025-10-02 12:15:55.575 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.022 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.024 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.024 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.025 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.025 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.025 2 WARNING nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.025 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.026 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.026 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.026 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.027 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:57 np0005466013 nova_compute[192144]: 2025-10-02 12:15:57.027 2 WARNING nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:15:58 np0005466013 nova_compute[192144]: 2025-10-02 12:15:58.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:58 np0005466013 nova_compute[192144]: 2025-10-02 12:15:58.475 2 INFO nova.compute.manager [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Unrescuing#033[00m
Oct  2 08:15:58 np0005466013 nova_compute[192144]: 2025-10-02 12:15:58.476 2 DEBUG oslo_concurrency.lockutils [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:58 np0005466013 nova_compute[192144]: 2025-10-02 12:15:58.476 2 DEBUG oslo_concurrency.lockutils [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquired lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:58 np0005466013 nova_compute[192144]: 2025-10-02 12:15:58.477 2 DEBUG nova.network.neutron [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:59 np0005466013 podman[231302]: 2025-10-02 12:15:59.707162164 +0000 UTC m=+0.071395017 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:15:59 np0005466013 podman[231303]: 2025-10-02 12:15:59.707923109 +0000 UTC m=+0.071620834 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:16:00 np0005466013 nova_compute[192144]: 2025-10-02 12:16:00.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:00.754 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.058 2 DEBUG nova.network.neutron [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Updating instance_info_cache with network_info: [{"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.083 2 DEBUG oslo_concurrency.lockutils [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Releasing lock "refresh_cache-54738e99-a5ed-4771-810b-8bea70fde21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.084 2 DEBUG nova.objects.instance [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'flavor' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:01 np0005466013 kernel: tapf7ea9721-35 (unregistering): left promiscuous mode
Oct  2 08:16:01 np0005466013 NetworkManager[51205]: <info>  [1759407361.1455] device (tapf7ea9721-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:01Z|00277|binding|INFO|Releasing lport f7ea9721-35c3-47ed-a0ac-d43479729826 from this chassis (sb_readonly=0)
Oct  2 08:16:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:01Z|00278|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 down in Southbound
Oct  2 08:16:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:01Z|00279|binding|INFO|Removing iface tapf7ea9721-35 ovn-installed in OVS
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.172 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:d3:3b 10.100.0.13'], port_security=['fa:16:3e:f2:d3:3b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '54738e99-a5ed-4771-810b-8bea70fde21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=f7ea9721-35c3-47ed-a0ac-d43479729826) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.175 103323 INFO neutron.agent.ovn.metadata.agent [-] Port f7ea9721-35c3-47ed-a0ac-d43479729826 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b unbound from our chassis#033[00m
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.176 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.177 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[69a32f51-75c0-4834-9625-9602170bcb5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:01 np0005466013 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Oct  2 08:16:01 np0005466013 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004d.scope: Consumed 6.701s CPU time.
Oct  2 08:16:01 np0005466013 systemd-machined[152202]: Machine qemu-34-instance-0000004d terminated.
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.418 2 INFO nova.virt.libvirt.driver [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance destroyed successfully.#033[00m
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.418 2 DEBUG nova.objects.instance [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'numa_topology' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:01 np0005466013 kernel: tapf7ea9721-35: entered promiscuous mode
Oct  2 08:16:01 np0005466013 NetworkManager[51205]: <info>  [1759407361.6822] manager: (tapf7ea9721-35): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Oct  2 08:16:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:01Z|00280|binding|INFO|Claiming lport f7ea9721-35c3-47ed-a0ac-d43479729826 for this chassis.
Oct  2 08:16:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:01Z|00281|binding|INFO|f7ea9721-35c3-47ed-a0ac-d43479729826: Claiming fa:16:3e:f2:d3:3b 10.100.0.13
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:01 np0005466013 systemd-udevd[231347]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.689 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:d3:3b 10.100.0.13'], port_security=['fa:16:3e:f2:d3:3b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '54738e99-a5ed-4771-810b-8bea70fde21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=f7ea9721-35c3-47ed-a0ac-d43479729826) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.690 103323 INFO neutron.agent.ovn.metadata.agent [-] Port f7ea9721-35c3-47ed-a0ac-d43479729826 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b bound to our chassis#033[00m
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.691 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:16:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:01.692 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[22cd4968-f79b-48c6-890e-b079acaef06f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:01Z|00282|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 ovn-installed in OVS
Oct  2 08:16:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:01Z|00283|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 up in Southbound
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:01 np0005466013 nova_compute[192144]: 2025-10-02 12:16:01.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:01 np0005466013 NetworkManager[51205]: <info>  [1759407361.7043] device (tapf7ea9721-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:01 np0005466013 NetworkManager[51205]: <info>  [1759407361.7055] device (tapf7ea9721-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:01 np0005466013 systemd-machined[152202]: New machine qemu-35-instance-0000004d.
Oct  2 08:16:01 np0005466013 systemd[1]: Started Virtual Machine qemu-35-instance-0000004d.
Oct  2 08:16:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:02.298 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:02.300 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:02.300 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.690 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 54738e99-a5ed-4771-810b-8bea70fde21a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.692 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407362.6901064, 54738e99-a5ed-4771-810b-8bea70fde21a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.692 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.696 2 DEBUG nova.compute.manager [None req-f30b7ca8-bb8f-4a92-a3ef-439b1f2ee506 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.741 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.745 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.777 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.777 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407362.6913605, 54738e99-a5ed-4771-810b-8bea70fde21a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.778 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.818 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:02 np0005466013 nova_compute[192144]: 2025-10-02 12:16:02.821 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:03 np0005466013 nova_compute[192144]: 2025-10-02 12:16:03.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.514 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.515 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.515 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.516 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.516 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.516 2 WARNING nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.517 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.517 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.517 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.518 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.518 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.518 2 WARNING nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.519 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.519 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.519 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.519 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.520 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.520 2 WARNING nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.520 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.520 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.521 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.521 2 DEBUG oslo_concurrency.lockutils [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.521 2 DEBUG nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:05 np0005466013 nova_compute[192144]: 2025-10-02 12:16:05.522 2 WARNING nova.compute.manager [req-293f4ea0-604a-4368-b574-a7be16b8cf60 req-30601c2a-a181-4b03-a6a7-eed0744a2adb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:06 np0005466013 nova_compute[192144]: 2025-10-02 12:16:06.517 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:06 np0005466013 nova_compute[192144]: 2025-10-02 12:16:06.517 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:06 np0005466013 nova_compute[192144]: 2025-10-02 12:16:06.518 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:06 np0005466013 nova_compute[192144]: 2025-10-02 12:16:06.518 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:06 np0005466013 nova_compute[192144]: 2025-10-02 12:16:06.518 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:06 np0005466013 nova_compute[192144]: 2025-10-02 12:16:06.941 2 INFO nova.compute.manager [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Terminating instance#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.041 2 DEBUG nova.compute.manager [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:16:07 np0005466013 kernel: tapf7ea9721-35 (unregistering): left promiscuous mode
Oct  2 08:16:07 np0005466013 NetworkManager[51205]: <info>  [1759407367.0681] device (tapf7ea9721-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:07Z|00284|binding|INFO|Releasing lport f7ea9721-35c3-47ed-a0ac-d43479729826 from this chassis (sb_readonly=0)
Oct  2 08:16:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:07Z|00285|binding|INFO|Setting lport f7ea9721-35c3-47ed-a0ac-d43479729826 down in Southbound
Oct  2 08:16:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:07Z|00286|binding|INFO|Removing iface tapf7ea9721-35 ovn-installed in OVS
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:07 np0005466013 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Oct  2 08:16:07 np0005466013 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004d.scope: Consumed 5.098s CPU time.
Oct  2 08:16:07 np0005466013 systemd-machined[152202]: Machine qemu-35-instance-0000004d terminated.
Oct  2 08:16:07 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:07.164 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:d3:3b 10.100.0.13'], port_security=['fa:16:3e:f2:d3:3b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '54738e99-a5ed-4771-810b-8bea70fde21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=f7ea9721-35c3-47ed-a0ac-d43479729826) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:07 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:07.165 103323 INFO neutron.agent.ovn.metadata.agent [-] Port f7ea9721-35c3-47ed-a0ac-d43479729826 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b unbound from our chassis#033[00m
Oct  2 08:16:07 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:07.166 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:16:07 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:07.167 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[730f5e8e-d0ba-46be-8de9-c98a40863422]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.312 2 INFO nova.virt.libvirt.driver [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Instance destroyed successfully.#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.313 2 DEBUG nova.objects.instance [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'resources' on Instance uuid 54738e99-a5ed-4771-810b-8bea70fde21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.339 2 DEBUG nova.virt.libvirt.vif [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:15:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-361250710',display_name='tempest-ServerRescueTestJSON-server-361250710',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-361250710',id=77,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-m05rmlnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-1453830403-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:02Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=54738e99-a5ed-4771-810b-8bea70fde21a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.339 2 DEBUG nova.network.os_vif_util [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "f7ea9721-35c3-47ed-a0ac-d43479729826", "address": "fa:16:3e:f2:d3:3b", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ea9721-35", "ovs_interfaceid": "f7ea9721-35c3-47ed-a0ac-d43479729826", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.340 2 DEBUG nova.network.os_vif_util [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.340 2 DEBUG os_vif [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.343 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ea9721-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.349 2 INFO os_vif [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=f7ea9721-35c3-47ed-a0ac-d43479729826,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ea9721-35')#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.350 2 INFO nova.virt.libvirt.driver [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Deleting instance files /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a_del#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.350 2 INFO nova.virt.libvirt.driver [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Deletion of /var/lib/nova/instances/54738e99-a5ed-4771-810b-8bea70fde21a_del complete#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.478 2 INFO nova.compute.manager [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.479 2 DEBUG oslo.service.loopingcall [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.479 2 DEBUG nova.compute.manager [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.479 2 DEBUG nova.network.neutron [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.788 2 DEBUG nova.compute.manager [req-2ddc969c-2c5c-4643-a379-61aa488158f1 req-97a7dc29-6327-44f8-b108-57e995c85281 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.788 2 DEBUG oslo_concurrency.lockutils [req-2ddc969c-2c5c-4643-a379-61aa488158f1 req-97a7dc29-6327-44f8-b108-57e995c85281 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.788 2 DEBUG oslo_concurrency.lockutils [req-2ddc969c-2c5c-4643-a379-61aa488158f1 req-97a7dc29-6327-44f8-b108-57e995c85281 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.788 2 DEBUG oslo_concurrency.lockutils [req-2ddc969c-2c5c-4643-a379-61aa488158f1 req-97a7dc29-6327-44f8-b108-57e995c85281 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.789 2 DEBUG nova.compute.manager [req-2ddc969c-2c5c-4643-a379-61aa488158f1 req-97a7dc29-6327-44f8-b108-57e995c85281 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:07 np0005466013 nova_compute[192144]: 2025-10-02 12:16:07.789 2 DEBUG nova.compute.manager [req-2ddc969c-2c5c-4643-a379-61aa488158f1 req-97a7dc29-6327-44f8-b108-57e995c85281 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-unplugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.466 2 DEBUG nova.network.neutron [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.486 2 INFO nova.compute.manager [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Took 1.01 seconds to deallocate network for instance.#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.583 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.584 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.640 2 DEBUG nova.compute.manager [req-1d02863c-0e86-4888-8589-b2b2e8e29f29 req-0a6c33f9-ef2f-45b6-ae2d-eefb1550c4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-deleted-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.779 2 DEBUG nova.compute.provider_tree [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.829 2 DEBUG nova.scheduler.client.report [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.895 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:08 np0005466013 nova_compute[192144]: 2025-10-02 12:16:08.945 2 INFO nova.scheduler.client.report [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Deleted allocations for instance 54738e99-a5ed-4771-810b-8bea70fde21a#033[00m
Oct  2 08:16:09 np0005466013 nova_compute[192144]: 2025-10-02 12:16:09.120 2 DEBUG oslo_concurrency.lockutils [None req-bc088ca0-7c6a-4761-852b-34cc3b05f1eb 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:10 np0005466013 nova_compute[192144]: 2025-10-02 12:16:10.122 2 DEBUG nova.compute.manager [req-37281fbe-ace2-4055-a6e6-107470ebc874 req-9b9d6d37-01bd-4a5e-8be1-4b61ce037626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:10 np0005466013 nova_compute[192144]: 2025-10-02 12:16:10.123 2 DEBUG oslo_concurrency.lockutils [req-37281fbe-ace2-4055-a6e6-107470ebc874 req-9b9d6d37-01bd-4a5e-8be1-4b61ce037626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:10 np0005466013 nova_compute[192144]: 2025-10-02 12:16:10.123 2 DEBUG oslo_concurrency.lockutils [req-37281fbe-ace2-4055-a6e6-107470ebc874 req-9b9d6d37-01bd-4a5e-8be1-4b61ce037626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:10 np0005466013 nova_compute[192144]: 2025-10-02 12:16:10.123 2 DEBUG oslo_concurrency.lockutils [req-37281fbe-ace2-4055-a6e6-107470ebc874 req-9b9d6d37-01bd-4a5e-8be1-4b61ce037626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "54738e99-a5ed-4771-810b-8bea70fde21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:10 np0005466013 nova_compute[192144]: 2025-10-02 12:16:10.124 2 DEBUG nova.compute.manager [req-37281fbe-ace2-4055-a6e6-107470ebc874 req-9b9d6d37-01bd-4a5e-8be1-4b61ce037626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] No waiting events found dispatching network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:10 np0005466013 nova_compute[192144]: 2025-10-02 12:16:10.124 2 WARNING nova.compute.manager [req-37281fbe-ace2-4055-a6e6-107470ebc874 req-9b9d6d37-01bd-4a5e-8be1-4b61ce037626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Received unexpected event network-vif-plugged-f7ea9721-35c3-47ed-a0ac-d43479729826 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:16:12 np0005466013 nova_compute[192144]: 2025-10-02 12:16:12.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:12 np0005466013 podman[231428]: 2025-10-02 12:16:12.702932826 +0000 UTC m=+0.064478098 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:16:12 np0005466013 podman[231429]: 2025-10-02 12:16:12.733861339 +0000 UTC m=+0.095585586 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:16:12 np0005466013 podman[231430]: 2025-10-02 12:16:12.750673192 +0000 UTC m=+0.112400519 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.566 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.567 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.567 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.567 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.567 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.736 2 INFO nova.compute.manager [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Terminating instance#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.835 2 DEBUG nova.compute.manager [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:16:13 np0005466013 kernel: tapcea86da2-59 (unregistering): left promiscuous mode
Oct  2 08:16:13 np0005466013 NetworkManager[51205]: <info>  [1759407373.8607] device (tapcea86da2-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:13Z|00287|binding|INFO|Releasing lport cea86da2-59ef-4fb2-a414-04dbebd56e75 from this chassis (sb_readonly=0)
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:13Z|00288|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 down in Southbound
Oct  2 08:16:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:13Z|00289|binding|INFO|Removing iface tapcea86da2-59 ovn-installed in OVS
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:13 np0005466013 nova_compute[192144]: 2025-10-02 12:16:13.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:13.894 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:84:17 10.100.0.14'], port_security=['fa:16:3e:06:84:17 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cea86da2-59ef-4fb2-a414-04dbebd56e75) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:13.895 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cea86da2-59ef-4fb2-a414-04dbebd56e75 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b unbound from our chassis#033[00m
Oct  2 08:16:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:13.896 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:16:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:13.897 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5dfeb152-ba8a-4eff-b88d-94cdaec3187f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:13 np0005466013 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Oct  2 08:16:13 np0005466013 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004a.scope: Consumed 15.201s CPU time.
Oct  2 08:16:13 np0005466013 systemd-machined[152202]: Machine qemu-32-instance-0000004a terminated.
Oct  2 08:16:14 np0005466013 kernel: tapcea86da2-59: entered promiscuous mode
Oct  2 08:16:14 np0005466013 systemd-udevd[231498]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:14 np0005466013 NetworkManager[51205]: <info>  [1759407374.0622] manager: (tapcea86da2-59): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00290|binding|INFO|Claiming lport cea86da2-59ef-4fb2-a414-04dbebd56e75 for this chassis.
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00291|binding|INFO|cea86da2-59ef-4fb2-a414-04dbebd56e75: Claiming fa:16:3e:06:84:17 10.100.0.14
Oct  2 08:16:14 np0005466013 kernel: tapcea86da2-59 (unregistering): left promiscuous mode
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.074 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:84:17 10.100.0.14'], port_security=['fa:16:3e:06:84:17 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cea86da2-59ef-4fb2-a414-04dbebd56e75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.076 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cea86da2-59ef-4fb2-a414-04dbebd56e75 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b bound to our chassis#033[00m
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.078 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.079 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5d683f0c-f910-4844-b57f-5950db1461a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00292|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 ovn-installed in OVS
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00293|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 up in Southbound
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00294|binding|INFO|Releasing lport cea86da2-59ef-4fb2-a414-04dbebd56e75 from this chassis (sb_readonly=1)
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00295|if_status|INFO|Dropped 1 log messages in last 540 seconds (most recently, 540 seconds ago) due to excessive rate
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00296|if_status|INFO|Not setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 down as sb is readonly
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00297|binding|INFO|Removing iface tapcea86da2-59 ovn-installed in OVS
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00298|binding|INFO|Releasing lport cea86da2-59ef-4fb2-a414-04dbebd56e75 from this chassis (sb_readonly=0)
Oct  2 08:16:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:14Z|00299|binding|INFO|Setting lport cea86da2-59ef-4fb2-a414-04dbebd56e75 down in Southbound
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.103 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:84:17 10.100.0.14'], port_security=['fa:16:3e:06:84:17 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '70d78115-9cfc-487c-95b6-f4f4149c52a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-295d1af7-5cfd-4501-a36e-13bd17ce835b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da589921190a470cab62d12688f03735', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ef00f507-77db-4cb1-a09e-91fc0b57cce7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4dc983a-7436-400e-b753-e53a88d71c73, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cea86da2-59ef-4fb2-a414-04dbebd56e75) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.104 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cea86da2-59ef-4fb2-a414-04dbebd56e75 in datapath 295d1af7-5cfd-4501-a36e-13bd17ce835b unbound from our chassis#033[00m
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.105 103323 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 295d1af7-5cfd-4501-a36e-13bd17ce835b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:16:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:14.106 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[342d2253-07e9-475b-ba9b-02e0bc0e98ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.139 2 INFO nova.virt.libvirt.driver [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Instance destroyed successfully.#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.140 2 DEBUG nova.objects.instance [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lazy-loading 'resources' on Instance uuid 70d78115-9cfc-487c-95b6-f4f4149c52a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.156 2 DEBUG nova.virt.libvirt.vif [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-899246229',display_name='tempest-ServerRescueTestJSON-server-899246229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-899246229',id=74,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da589921190a470cab62d12688f03735',ramdisk_id='',reservation_id='r-l0lf8422',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1453830403',owner_user_name='tempest-ServerRescueTestJSON-1453830403-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:15:09Z,user_data=None,user_id='7b25dedc41b548469d2b0627e3255b9e',uuid=70d78115-9cfc-487c-95b6-f4f4149c52a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.157 2 DEBUG nova.network.os_vif_util [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converting VIF {"id": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "address": "fa:16:3e:06:84:17", "network": {"id": "295d1af7-5cfd-4501-a36e-13bd17ce835b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2032524031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "da589921190a470cab62d12688f03735", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea86da2-59", "ovs_interfaceid": "cea86da2-59ef-4fb2-a414-04dbebd56e75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.157 2 DEBUG nova.network.os_vif_util [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.158 2 DEBUG os_vif [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.159 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcea86da2-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.165 2 INFO os_vif [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:84:17,bridge_name='br-int',has_traffic_filtering=True,id=cea86da2-59ef-4fb2-a414-04dbebd56e75,network=Network(295d1af7-5cfd-4501-a36e-13bd17ce835b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea86da2-59')#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.165 2 INFO nova.virt.libvirt.driver [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Deleting instance files /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1_del#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.166 2 INFO nova.virt.libvirt.driver [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Deletion of /var/lib/nova/instances/70d78115-9cfc-487c-95b6-f4f4149c52a1_del complete#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.240 2 INFO nova.compute.manager [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.241 2 DEBUG oslo.service.loopingcall [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.241 2 DEBUG nova.compute.manager [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.242 2 DEBUG nova.network.neutron [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.474 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.475 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.531 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.755 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.756 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.763 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.764 2 INFO nova.compute.claims [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:16:14 np0005466013 nova_compute[192144]: 2025-10-02 12:16:14.997 2 DEBUG nova.compute.provider_tree [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.017 2 DEBUG nova.scheduler.client.report [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.056 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.057 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.233 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.234 2 DEBUG nova.network.neutron [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.270 2 INFO nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.314 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:15 np0005466013 nova_compute[192144]: 2025-10-02 12:16:15.569 2 DEBUG nova.policy [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'def48c13fd6a43ba88836b753986a731', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffae703d68b24b9c89686c149113fc2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.803 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.804 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.804 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.805 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.805 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.805 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.806 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.806 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.806 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.807 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.807 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.808 2 WARNING nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state rescued and task_state deleting.#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.808 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.808 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.808 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.809 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.809 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.810 2 WARNING nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state rescued and task_state deleting.#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.810 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.810 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.811 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.811 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.811 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.812 2 WARNING nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state rescued and task_state deleting.#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.812 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.812 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.812 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.813 2 DEBUG oslo_concurrency.lockutils [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.813 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.813 2 DEBUG nova.compute.manager [req-2a43e954-ce9e-4f98-97aa-8c00dd62acf6 req-71218edd-c017-46ea-b949-43cbb1aefcc9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-unplugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.845 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.847 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.847 2 INFO nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Creating image(s)#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.848 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "/var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.848 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.849 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.865 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.933 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.934 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.935 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:16 np0005466013 nova_compute[192144]: 2025-10-02 12:16:16.951 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.023 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.024 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.062 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.063 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.064 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.128 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.130 2 DEBUG nova.virt.disk.api [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Checking if we can resize image /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.131 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.192 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.193 2 DEBUG nova.virt.disk.api [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Cannot resize image /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.194 2 DEBUG nova.objects.instance [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid 379bce9e-24c6-4d6e-9438-28eed217ca12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.205 2 DEBUG nova.network.neutron [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.234 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.234 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Ensure instance console log exists: /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.235 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.235 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.235 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.259 2 INFO nova.compute.manager [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Took 3.02 seconds to deallocate network for instance.#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.435 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.435 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.554 2 DEBUG nova.compute.provider_tree [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.580 2 DEBUG nova.scheduler.client.report [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.632 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.671 2 INFO nova.scheduler.client.report [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Deleted allocations for instance 70d78115-9cfc-487c-95b6-f4f4149c52a1#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.697 2 DEBUG nova.network.neutron [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Successfully created port: 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:17 np0005466013 nova_compute[192144]: 2025-10-02 12:16:17.757 2 DEBUG oslo_concurrency.lockutils [None req-d3f94028-6f42-4339-92c2-fc347a59bf61 7b25dedc41b548469d2b0627e3255b9e da589921190a470cab62d12688f03735 - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:18 np0005466013 nova_compute[192144]: 2025-10-02 12:16:18.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:18 np0005466013 nova_compute[192144]: 2025-10-02 12:16:18.829 2 DEBUG nova.network.neutron [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Successfully updated port: 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:16:18 np0005466013 nova_compute[192144]: 2025-10-02 12:16:18.880 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:18 np0005466013 nova_compute[192144]: 2025-10-02 12:16:18.880 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:18 np0005466013 nova_compute[192144]: 2025-10-02 12:16:18.880 2 DEBUG nova.network.neutron [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.051 2 DEBUG nova.network.neutron [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.406 2 DEBUG nova.compute.manager [req-45e31d2a-f39d-42d2-bcaf-a80d4f037648 req-d718378b-c13c-4bd2-bc31-c09911dfbed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.407 2 DEBUG oslo_concurrency.lockutils [req-45e31d2a-f39d-42d2-bcaf-a80d4f037648 req-d718378b-c13c-4bd2-bc31-c09911dfbed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.407 2 DEBUG oslo_concurrency.lockutils [req-45e31d2a-f39d-42d2-bcaf-a80d4f037648 req-d718378b-c13c-4bd2-bc31-c09911dfbed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.408 2 DEBUG oslo_concurrency.lockutils [req-45e31d2a-f39d-42d2-bcaf-a80d4f037648 req-d718378b-c13c-4bd2-bc31-c09911dfbed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "70d78115-9cfc-487c-95b6-f4f4149c52a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.408 2 DEBUG nova.compute.manager [req-45e31d2a-f39d-42d2-bcaf-a80d4f037648 req-d718378b-c13c-4bd2-bc31-c09911dfbed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] No waiting events found dispatching network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.408 2 WARNING nova.compute.manager [req-45e31d2a-f39d-42d2-bcaf-a80d4f037648 req-d718378b-c13c-4bd2-bc31-c09911dfbed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received unexpected event network-vif-plugged-cea86da2-59ef-4fb2-a414-04dbebd56e75 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.408 2 DEBUG nova.compute.manager [req-45e31d2a-f39d-42d2-bcaf-a80d4f037648 req-d718378b-c13c-4bd2-bc31-c09911dfbed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Received event network-vif-deleted-cea86da2-59ef-4fb2-a414-04dbebd56e75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.843 2 DEBUG nova.network.neutron [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Updating instance_info_cache with network_info: [{"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.862 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.863 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Instance network_info: |[{"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.865 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Start _get_guest_xml network_info=[{"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.869 2 WARNING nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.873 2 DEBUG nova.virt.libvirt.host [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.874 2 DEBUG nova.virt.libvirt.host [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.877 2 DEBUG nova.virt.libvirt.host [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.878 2 DEBUG nova.virt.libvirt.host [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.879 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.879 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.880 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.880 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.880 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.881 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.881 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.881 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.882 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.882 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.882 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.883 2 DEBUG nova.virt.hardware [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.887 2 DEBUG nova.virt.libvirt.vif [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1069325640',display_name='tempest-ServerDiskConfigTestJSON-server-1069325640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1069325640',id=81,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-t0ssxxiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:15Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=379bce9e-24c6-4d6e-9438-28eed217ca12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.888 2 DEBUG nova.network.os_vif_util [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.889 2 DEBUG nova.network.os_vif_util [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:72:a3,bridge_name='br-int',has_traffic_filtering=True,id=8bb13439-1bbb-437a-8c8a-d41bd8de3c01,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb13439-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.890 2 DEBUG nova.objects.instance [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 379bce9e-24c6-4d6e-9438-28eed217ca12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.909 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <uuid>379bce9e-24c6-4d6e-9438-28eed217ca12</uuid>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <name>instance-00000051</name>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1069325640</nova:name>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:16:19</nova:creationTime>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:user uuid="def48c13fd6a43ba88836b753986a731">tempest-ServerDiskConfigTestJSON-1763056137-project-member</nova:user>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:project uuid="ffae703d68b24b9c89686c149113fc2b">tempest-ServerDiskConfigTestJSON-1763056137</nova:project>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        <nova:port uuid="8bb13439-1bbb-437a-8c8a-d41bd8de3c01">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <entry name="serial">379bce9e-24c6-4d6e-9438-28eed217ca12</entry>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <entry name="uuid">379bce9e-24c6-4d6e-9438-28eed217ca12</entry>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk.config"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:99:72:a3"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <target dev="tap8bb13439-1b"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/console.log" append="off"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:16:19 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:16:19 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:16:19 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:16:19 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.911 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Preparing to wait for external event network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.911 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.912 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.912 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.913 2 DEBUG nova.virt.libvirt.vif [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1069325640',display_name='tempest-ServerDiskConfigTestJSON-server-1069325640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1069325640',id=81,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-t0ssxxiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:15Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=379bce9e-24c6-4d6e-9438-28eed217ca12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.913 2 DEBUG nova.network.os_vif_util [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.914 2 DEBUG nova.network.os_vif_util [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:72:a3,bridge_name='br-int',has_traffic_filtering=True,id=8bb13439-1bbb-437a-8c8a-d41bd8de3c01,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb13439-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.914 2 DEBUG os_vif [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:72:a3,bridge_name='br-int',has_traffic_filtering=True,id=8bb13439-1bbb-437a-8c8a-d41bd8de3c01,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb13439-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.915 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.916 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.918 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bb13439-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.919 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8bb13439-1b, col_values=(('external_ids', {'iface-id': '8bb13439-1bbb-437a-8c8a-d41bd8de3c01', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:72:a3', 'vm-uuid': '379bce9e-24c6-4d6e-9438-28eed217ca12'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466013 NetworkManager[51205]: <info>  [1759407379.9220] manager: (tap8bb13439-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.926 2 INFO os_vif [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:72:a3,bridge_name='br-int',has_traffic_filtering=True,id=8bb13439-1bbb-437a-8c8a-d41bd8de3c01,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb13439-1b')#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.987 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.988 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.988 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No VIF found with MAC fa:16:3e:99:72:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.989 2 INFO nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Using config drive#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:19 np0005466013 nova_compute[192144]: 2025-10-02 12:16:19.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.373 2 INFO nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Creating config drive at /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk.config#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.378 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6tp0v7h7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.509 2 DEBUG oslo_concurrency.processutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6tp0v7h7" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.525 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.526 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.559 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:20 np0005466013 kernel: tap8bb13439-1b: entered promiscuous mode
Oct  2 08:16:20 np0005466013 NetworkManager[51205]: <info>  [1759407380.5861] manager: (tap8bb13439-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Oct  2 08:16:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:20Z|00300|binding|INFO|Claiming lport 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 for this chassis.
Oct  2 08:16:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:20Z|00301|binding|INFO|8bb13439-1bbb-437a-8c8a-d41bd8de3c01: Claiming fa:16:3e:99:72:a3 10.100.0.9
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.601 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:72:a3 10.100.0.9'], port_security=['fa:16:3e:99:72:a3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '379bce9e-24c6-4d6e-9438-28eed217ca12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=8bb13439-1bbb-437a-8c8a-d41bd8de3c01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.602 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b bound to our chassis#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.604 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b#033[00m
Oct  2 08:16:20 np0005466013 systemd-udevd[231548]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.619 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b2117c88-4eb4-476c-a081-0b2c579ac80d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.620 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6de4737-c1 in ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.623 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6de4737-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.624 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ee195f8d-21f3-46c9-9f94-089a3acb1f81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.625 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[af696f26-baae-4e7c-88f5-c1cf066161da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 NetworkManager[51205]: <info>  [1759407380.6334] device (tap8bb13439-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:20 np0005466013 NetworkManager[51205]: <info>  [1759407380.6347] device (tap8bb13439-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.638 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[1baf57a7-1136-49a1-99a8-83dc297cc133]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 systemd-machined[152202]: New machine qemu-36-instance-00000051.
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466013 systemd[1]: Started Virtual Machine qemu-36-instance-00000051.
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.670 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5eabf10f-1d06-4704-9f64-91b75ddf9781]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:20Z|00302|binding|INFO|Setting lport 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 ovn-installed in OVS
Oct  2 08:16:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:20Z|00303|binding|INFO|Setting lport 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 up in Southbound
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.701 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.702 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.702 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a59c95cf-36d7-47ff-a726-b57e3d43d83a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 NetworkManager[51205]: <info>  [1759407380.7134] manager: (tapd6de4737-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/142)
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.712 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea8a9c8-8e24-4d6e-89fb-c49f0d8c4fa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.714 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.715 2 INFO nova.compute.claims [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.756 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9f258f9c-25db-4629-9281-f4efa082a3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.760 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ecca7f54-3f24-440e-bcd6-2aae7b6c7809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 NetworkManager[51205]: <info>  [1759407380.7870] device (tapd6de4737-c0): carrier: link connected
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.796 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[cce53d92-642c-4542-bcc0-176997e9fd69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.819 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bde56449-ce5f-4a99-9d85-d257969a3c4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537240, 'reachable_time': 19334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231583, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.842 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e9177ad2-e1e1-41ac-9288-93c120143bc4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:c91f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537240, 'tstamp': 537240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231584, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.863 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[52b5406a-aabe-4a7a-a2d5-61f72c36b288]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537240, 'reachable_time': 19334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231585, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.897 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ba87e6-550b-4be0-8774-c6f0ea62fa5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.914 2 DEBUG nova.compute.provider_tree [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.934 2 DEBUG nova.scheduler.client.report [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.963 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[37f94dff-7054-49c8-8d69-ad5b3314c8f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.964 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.965 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.965 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6de4737-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466013 NetworkManager[51205]: <info>  [1759407380.9675] manager: (tapd6de4737-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Oct  2 08:16:20 np0005466013 kernel: tapd6de4737-c0: entered promiscuous mode
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.969 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6de4737-c0, col_values=(('external_ids', {'iface-id': 'cc451eb7-bf34-4b54-96d8-b834f11e06fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:20Z|00304|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466013 nova_compute[192144]: 2025-10-02 12:16:20.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.985 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.986 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3a87d4cf-eb0e-453c-b65c-463eddc1ecdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.987 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:20.989 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'env', 'PROCESS_TAG=haproxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.007 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.007 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.032 2 DEBUG nova.compute.manager [req-de1305f1-ca7d-4a23-bf32-dea2f4035100 req-a28bbc5f-c43c-440d-8d01-e0c16389ab8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received event network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.033 2 DEBUG oslo_concurrency.lockutils [req-de1305f1-ca7d-4a23-bf32-dea2f4035100 req-a28bbc5f-c43c-440d-8d01-e0c16389ab8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.033 2 DEBUG oslo_concurrency.lockutils [req-de1305f1-ca7d-4a23-bf32-dea2f4035100 req-a28bbc5f-c43c-440d-8d01-e0c16389ab8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.033 2 DEBUG oslo_concurrency.lockutils [req-de1305f1-ca7d-4a23-bf32-dea2f4035100 req-a28bbc5f-c43c-440d-8d01-e0c16389ab8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.034 2 DEBUG nova.compute.manager [req-de1305f1-ca7d-4a23-bf32-dea2f4035100 req-a28bbc5f-c43c-440d-8d01-e0c16389ab8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Processing event network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.169 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.170 2 DEBUG nova.network.neutron [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.201 2 INFO nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.234 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:21 np0005466013 podman[231624]: 2025-10-02 12:16:21.353289268 +0000 UTC m=+0.054370168 container create c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:16:21 np0005466013 systemd[1]: Started libpod-conmon-c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124.scope.
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.404 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.406 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.407 2 INFO nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Creating image(s)#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.407 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "/var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.408 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.409 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:16:21 np0005466013 podman[231624]: 2025-10-02 12:16:21.325247297 +0000 UTC m=+0.026328217 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.421 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407381.4208772, 379bce9e-24c6-4d6e-9438-28eed217ca12 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.422 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:21 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd75b75e1c0d918f3b94a16cadfb82ac1de3982614444dcabff7c25f51f270c0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.425 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.426 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:21 np0005466013 podman[231624]: 2025-10-02 12:16:21.441985913 +0000 UTC m=+0.143066833 container init c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:16:21 np0005466013 podman[231624]: 2025-10-02 12:16:21.448282413 +0000 UTC m=+0.149363313 container start c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.454 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.460 2 DEBUG nova.policy [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0ba8ddde504431b51e593c63f40361', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5db64e6714348c1a7f57bb53de80915', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.463 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.468 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:21 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[231640]: [NOTICE]   (231645) : New worker (231647) forked
Oct  2 08:16:21 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[231640]: [NOTICE]   (231645) : Loading success.
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.474 2 INFO nova.virt.libvirt.driver [-] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Instance spawned successfully.#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.474 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.497 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.498 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.499 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.514 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.543 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.544 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407381.4211159, 379bce9e-24c6-4d6e-9438-28eed217ca12 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.545 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.548 2 DEBUG nova.compute.manager [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received event network-changed-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.549 2 DEBUG nova.compute.manager [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Refreshing instance network info cache due to event network-changed-8bb13439-1bbb-437a-8c8a-d41bd8de3c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.549 2 DEBUG oslo_concurrency.lockutils [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.549 2 DEBUG oslo_concurrency.lockutils [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.550 2 DEBUG nova.network.neutron [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Refreshing network info cache for port 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.557 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.557 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.558 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.558 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.559 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.559 2 DEBUG nova.virt.libvirt.driver [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.571 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.572 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.596 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.605 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407381.4346588, 379bce9e-24c6-4d6e-9438-28eed217ca12 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.605 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.611 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.612 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.612 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.641 2 INFO nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Took 4.80 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.643 2 DEBUG nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.644 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.654 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.678 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.680 2 DEBUG nova.virt.disk.api [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Checking if we can resize image /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.680 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.702 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.737 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.739 2 DEBUG nova.virt.disk.api [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Cannot resize image /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.739 2 DEBUG nova.objects.instance [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'migration_context' on Instance uuid 85bfe864-3153-4ef5-b286-f2f31e93f994 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.756 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.757 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Ensure instance console log exists: /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.758 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.758 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.758 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.769 2 INFO nova.compute.manager [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Took 7.10 seconds to build instance.#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.796 2 DEBUG oslo_concurrency.lockutils [None req-ed4ce3ab-7c3b-437b-98a1-ece1b32bfddb def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466013 nova_compute[192144]: 2025-10-02 12:16:21.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.024 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.114 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:22 np0005466013 podman[231670]: 2025-10-02 12:16:22.173881088 +0000 UTC m=+0.099347094 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.198 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.199 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.262 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.310 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407367.309214, 54738e99-a5ed-4771-810b-8bea70fde21a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.311 2 INFO nova.compute.manager [-] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.428 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.431 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5643MB free_disk=73.35155868530273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.431 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.431 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.434 2 DEBUG nova.compute.manager [None req-368178ae-3a8d-4332-839b-797c5bb1cc73 - - - - - -] [instance: 54738e99-a5ed-4771-810b-8bea70fde21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:22Z|00305|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.531 2 DEBUG nova.network.neutron [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Successfully created port: c7aff639-f0e0-4477-80e1-796f2e00f4fc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.623 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 379bce9e-24c6-4d6e-9438-28eed217ca12 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.623 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 85bfe864-3153-4ef5-b286-f2f31e93f994 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.624 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.624 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.704 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.731 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.783 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:16:22 np0005466013 nova_compute[192144]: 2025-10-02 12:16:22.784 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.142 2 DEBUG nova.compute.manager [req-a0fb4177-ed64-435a-8205-3ca9bc1ce303 req-db36c5e2-8a74-4dd5-bbae-6dd60fc0d7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received event network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.143 2 DEBUG oslo_concurrency.lockutils [req-a0fb4177-ed64-435a-8205-3ca9bc1ce303 req-db36c5e2-8a74-4dd5-bbae-6dd60fc0d7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.143 2 DEBUG oslo_concurrency.lockutils [req-a0fb4177-ed64-435a-8205-3ca9bc1ce303 req-db36c5e2-8a74-4dd5-bbae-6dd60fc0d7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.143 2 DEBUG oslo_concurrency.lockutils [req-a0fb4177-ed64-435a-8205-3ca9bc1ce303 req-db36c5e2-8a74-4dd5-bbae-6dd60fc0d7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.143 2 DEBUG nova.compute.manager [req-a0fb4177-ed64-435a-8205-3ca9bc1ce303 req-db36c5e2-8a74-4dd5-bbae-6dd60fc0d7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] No waiting events found dispatching network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.144 2 WARNING nova.compute.manager [req-a0fb4177-ed64-435a-8205-3ca9bc1ce303 req-db36c5e2-8a74-4dd5-bbae-6dd60fc0d7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received unexpected event network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.720 2 DEBUG nova.network.neutron [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Updated VIF entry in instance network info cache for port 8bb13439-1bbb-437a-8c8a-d41bd8de3c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.721 2 DEBUG nova.network.neutron [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Updating instance_info_cache with network_info: [{"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.724 2 DEBUG nova.network.neutron [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Successfully updated port: c7aff639-f0e0-4477-80e1-796f2e00f4fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.742 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-85bfe864-3153-4ef5-b286-f2f31e93f994" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.743 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-85bfe864-3153-4ef5-b286-f2f31e93f994" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.743 2 DEBUG nova.network.neutron [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.745 2 DEBUG oslo_concurrency.lockutils [req-781754b7-7a8d-45dd-b2ae-a1857b31620d req-7cd3aac7-a988-4ef8-897a-6bd07a65e665 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.785 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.883 2 DEBUG nova.compute.manager [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received event network-changed-c7aff639-f0e0-4477-80e1-796f2e00f4fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.884 2 DEBUG nova.compute.manager [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Refreshing instance network info cache due to event network-changed-c7aff639-f0e0-4477-80e1-796f2e00f4fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:23 np0005466013 nova_compute[192144]: 2025-10-02 12:16:23.884 2 DEBUG oslo_concurrency.lockutils [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-85bfe864-3153-4ef5-b286-f2f31e93f994" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:24 np0005466013 nova_compute[192144]: 2025-10-02 12:16:24.004 2 DEBUG nova.network.neutron [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:16:24 np0005466013 podman[231697]: 2025-10-02 12:16:24.695913375 +0000 UTC m=+0.067549966 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:16:24 np0005466013 podman[231698]: 2025-10-02 12:16:24.724372998 +0000 UTC m=+0.093552430 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:16:24 np0005466013 nova_compute[192144]: 2025-10-02 12:16:24.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:24 np0005466013 nova_compute[192144]: 2025-10-02 12:16:24.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:24 np0005466013 nova_compute[192144]: 2025-10-02 12:16:24.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:16:24 np0005466013 nova_compute[192144]: 2025-10-02 12:16:24.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:16:25 np0005466013 nova_compute[192144]: 2025-10-02 12:16:25.018 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:16:25 np0005466013 nova_compute[192144]: 2025-10-02 12:16:25.937 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:25 np0005466013 nova_compute[192144]: 2025-10-02 12:16:25.938 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:25 np0005466013 nova_compute[192144]: 2025-10-02 12:16:25.938 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:16:25 np0005466013 nova_compute[192144]: 2025-10-02 12:16:25.938 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 379bce9e-24c6-4d6e-9438-28eed217ca12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.012 2 DEBUG nova.network.neutron [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Updating instance_info_cache with network_info: [{"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.045 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-85bfe864-3153-4ef5-b286-f2f31e93f994" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.046 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Instance network_info: |[{"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.046 2 DEBUG oslo_concurrency.lockutils [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-85bfe864-3153-4ef5-b286-f2f31e93f994" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.046 2 DEBUG nova.network.neutron [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Refreshing network info cache for port c7aff639-f0e0-4477-80e1-796f2e00f4fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.049 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Start _get_guest_xml network_info=[{"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.053 2 WARNING nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.064 2 DEBUG nova.virt.libvirt.host [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.065 2 DEBUG nova.virt.libvirt.host [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.074 2 DEBUG nova.virt.libvirt.host [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.075 2 DEBUG nova.virt.libvirt.host [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.076 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.076 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.077 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.077 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.077 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.077 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.078 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.078 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.078 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.078 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.079 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.079 2 DEBUG nova.virt.hardware [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.082 2 DEBUG nova.virt.libvirt.vif [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-565092550',display_name='tempest-DeleteServersTestJSON-server-565092550',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-565092550',id=82,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-0u75o0es',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982
240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:21Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=85bfe864-3153-4ef5-b286-f2f31e93f994,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.083 2 DEBUG nova.network.os_vif_util [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.083 2 DEBUG nova.network.os_vif_util [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=c7aff639-f0e0-4477-80e1-796f2e00f4fc,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7aff639-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.084 2 DEBUG nova.objects.instance [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 85bfe864-3153-4ef5-b286-f2f31e93f994 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.110 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <uuid>85bfe864-3153-4ef5-b286-f2f31e93f994</uuid>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <name>instance-00000052</name>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <nova:name>tempest-DeleteServersTestJSON-server-565092550</nova:name>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:16:26</nova:creationTime>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:user uuid="0c0ba8ddde504431b51e593c63f40361">tempest-DeleteServersTestJSON-548982240-project-member</nova:user>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:project uuid="d5db64e6714348c1a7f57bb53de80915">tempest-DeleteServersTestJSON-548982240</nova:project>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        <nova:port uuid="c7aff639-f0e0-4477-80e1-796f2e00f4fc">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <entry name="serial">85bfe864-3153-4ef5-b286-f2f31e93f994</entry>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <entry name="uuid">85bfe864-3153-4ef5-b286-f2f31e93f994</entry>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk.config"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:14:ad:ee"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <target dev="tapc7aff639-f0"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/console.log" append="off"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:16:26 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:16:26 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:16:26 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:16:26 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.111 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Preparing to wait for external event network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.111 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.111 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.111 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.112 2 DEBUG nova.virt.libvirt.vif [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-565092550',display_name='tempest-DeleteServersTestJSON-server-565092550',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-565092550',id=82,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-0u75o0es',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJ
SON-548982240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:21Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=85bfe864-3153-4ef5-b286-f2f31e93f994,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.112 2 DEBUG nova.network.os_vif_util [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.113 2 DEBUG nova.network.os_vif_util [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=c7aff639-f0e0-4477-80e1-796f2e00f4fc,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7aff639-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.113 2 DEBUG os_vif [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=c7aff639-f0e0-4477-80e1-796f2e00f4fc,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7aff639-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.114 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.114 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.117 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc7aff639-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.118 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc7aff639-f0, col_values=(('external_ids', {'iface-id': 'c7aff639-f0e0-4477-80e1-796f2e00f4fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:ad:ee', 'vm-uuid': '85bfe864-3153-4ef5-b286-f2f31e93f994'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:26 np0005466013 NetworkManager[51205]: <info>  [1759407386.1206] manager: (tapc7aff639-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.130 2 INFO os_vif [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=c7aff639-f0e0-4477-80e1-796f2e00f4fc,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7aff639-f0')#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.194 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.195 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.210 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No VIF found with MAC fa:16:3e:14:ad:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:26 np0005466013 nova_compute[192144]: 2025-10-02 12:16:26.211 2 INFO nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Using config drive#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.008 2 INFO nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Creating config drive at /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk.config#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.014 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8rp3w99k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.147 2 DEBUG oslo_concurrency.processutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8rp3w99k" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:27 np0005466013 kernel: tapc7aff639-f0: entered promiscuous mode
Oct  2 08:16:27 np0005466013 NetworkManager[51205]: <info>  [1759407387.2176] manager: (tapc7aff639-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Oct  2 08:16:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:27Z|00306|binding|INFO|Claiming lport c7aff639-f0e0-4477-80e1-796f2e00f4fc for this chassis.
Oct  2 08:16:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:27Z|00307|binding|INFO|c7aff639-f0e0-4477-80e1-796f2e00f4fc: Claiming fa:16:3e:14:ad:ee 10.100.0.11
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:27 np0005466013 systemd-udevd[231757]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:27 np0005466013 systemd-machined[152202]: New machine qemu-37-instance-00000052.
Oct  2 08:16:27 np0005466013 NetworkManager[51205]: <info>  [1759407387.2627] device (tapc7aff639-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:27 np0005466013 NetworkManager[51205]: <info>  [1759407387.2638] device (tapc7aff639-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:27 np0005466013 systemd[1]: Started Virtual Machine qemu-37-instance-00000052.
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:27Z|00308|binding|INFO|Setting lport c7aff639-f0e0-4477-80e1-796f2e00f4fc ovn-installed in OVS
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:27Z|00309|binding|INFO|Setting lport c7aff639-f0e0-4477-80e1-796f2e00f4fc up in Southbound
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.340 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:ad:ee 10.100.0.11'], port_security=['fa:16:3e:14:ad:ee 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '85bfe864-3153-4ef5-b286-f2f31e93f994', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c7aff639-f0e0-4477-80e1-796f2e00f4fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.343 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c7aff639-f0e0-4477-80e1-796f2e00f4fc in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 bound to our chassis#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.344 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97b8849-844c-4190-8b13-fd7a2d073ce8#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.357 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b770509d-f7c3-4438-b01e-4936c7cf8b61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.358 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97b8849-81 in ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.360 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97b8849-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.360 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[991a50de-c084-4e4d-9946-3f993d60c288]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.361 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4962bfe1-8ed0-4836-8ce5-a179a5c6b48a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.378 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[d09bf681-f774-44e5-82ed-677e5d9bc78b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.397 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[28ac71d6-6783-4c5c-a575-28f36b581411]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.428 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[6a92629b-57d2-409e-a2d4-70da9b1e4b77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 NetworkManager[51205]: <info>  [1759407387.4373] manager: (tapb97b8849-80): new Veth device (/org/freedesktop/NetworkManager/Devices/146)
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.437 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5338e477-b841-4d67-a95c-73db316e9eb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 systemd-udevd[231760]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.476 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[06d805a1-6365-4120-9785-63c52e754e7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.480 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c14c7a-80a2-42da-871d-c28394bf08c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 NetworkManager[51205]: <info>  [1759407387.5064] device (tapb97b8849-80): carrier: link connected
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.514 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e0871b75-5f25-422f-b042-b01b0ea02cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.535 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8269549f-e97c-4b0c-9ea7-497354de156a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537912, 'reachable_time': 28524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231798, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.557 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1b909b36-6fa6-4c3e-b4c9-555784f5f6cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:e0b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537912, 'tstamp': 537912}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231800, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.582 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cc25c956-a979-4292-8a7f-c0fabdeb2969]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537912, 'reachable_time': 28524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231801, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.624 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[24bc9710-853e-4d93-9509-66d643ef0b35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.694 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[03e58fdb-c523-472e-af34-457a1f1f32b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.696 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.696 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.697 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97b8849-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:27 np0005466013 NetworkManager[51205]: <info>  [1759407387.6999] manager: (tapb97b8849-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Oct  2 08:16:27 np0005466013 kernel: tapb97b8849-80: entered promiscuous mode
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.703 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97b8849-80, col_values=(('external_ids', {'iface-id': '055cf080-4472-4807-a697-69de84e96953'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:27Z|00310|binding|INFO|Releasing lport 055cf080-4472-4807-a697-69de84e96953 from this chassis (sb_readonly=0)
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.717 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.718 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[70b47445-2f6f-4115-b477-35f037b9aab4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.719 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:27.720 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'env', 'PROCESS_TAG=haproxy-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97b8849-844c-4190-8b13-fd7a2d073ce8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.968 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407387.9675379, 85bfe864-3153-4ef5-b286-f2f31e93f994 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.968 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.989 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.996 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407387.971339, 85bfe864-3153-4ef5-b286-f2f31e93f994 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:27 np0005466013 nova_compute[192144]: 2025-10-02 12:16:27.996 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.013 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.018 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.045 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:28 np0005466013 podman[231831]: 2025-10-02 12:16:28.131518994 +0000 UTC m=+0.078780722 container create d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:16:28 np0005466013 podman[231831]: 2025-10-02 12:16:28.086381721 +0000 UTC m=+0.033643489 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:28 np0005466013 systemd[1]: Started libpod-conmon-d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312.scope.
Oct  2 08:16:28 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:16:28 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d96118c6bf071e479c9e84a8237b7824817353b45519483c7bc8e23a68096a2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:28 np0005466013 podman[231831]: 2025-10-02 12:16:28.257380139 +0000 UTC m=+0.204641857 container init d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:16:28 np0005466013 podman[231831]: 2025-10-02 12:16:28.263666339 +0000 UTC m=+0.210928037 container start d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:16:28 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[231846]: [NOTICE]   (231850) : New worker (231852) forked
Oct  2 08:16:28 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[231846]: [NOTICE]   (231850) : Loading success.
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.374 2 DEBUG nova.compute.manager [req-ef91246a-7ee4-404f-bc55-3f5671f92139 req-268b4b3b-ad6b-4d3b-ad0f-16d58b92e1de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received event network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.375 2 DEBUG oslo_concurrency.lockutils [req-ef91246a-7ee4-404f-bc55-3f5671f92139 req-268b4b3b-ad6b-4d3b-ad0f-16d58b92e1de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.376 2 DEBUG oslo_concurrency.lockutils [req-ef91246a-7ee4-404f-bc55-3f5671f92139 req-268b4b3b-ad6b-4d3b-ad0f-16d58b92e1de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.376 2 DEBUG oslo_concurrency.lockutils [req-ef91246a-7ee4-404f-bc55-3f5671f92139 req-268b4b3b-ad6b-4d3b-ad0f-16d58b92e1de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.377 2 DEBUG nova.compute.manager [req-ef91246a-7ee4-404f-bc55-3f5671f92139 req-268b4b3b-ad6b-4d3b-ad0f-16d58b92e1de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Processing event network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.378 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.383 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407388.383049, 85bfe864-3153-4ef5-b286-f2f31e93f994 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.384 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.402 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.405 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.408 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.412 2 INFO nova.virt.libvirt.driver [-] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Instance spawned successfully.#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.412 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.430 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.440 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.441 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.441 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.442 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.442 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.442 2 DEBUG nova.virt.libvirt.driver [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.510 2 INFO nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Took 7.10 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.510 2 DEBUG nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.587 2 INFO nova.compute.manager [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Took 7.93 seconds to build instance.#033[00m
Oct  2 08:16:28 np0005466013 nova_compute[192144]: 2025-10-02 12:16:28.607 2 DEBUG oslo_concurrency.lockutils [None req-e72f5599-4686-4027-b979-be0a56eb9cb3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.137 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407374.1367686, 70d78115-9cfc-487c-95b6-f4f4149c52a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.138 2 INFO nova.compute.manager [-] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.166 2 DEBUG nova.compute.manager [None req-bd1a9199-b6d2-4d03-a67c-6181a5a49a41 - - - - - -] [instance: 70d78115-9cfc-487c-95b6-f4f4149c52a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.351 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Updating instance_info_cache with network_info: [{"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.361 2 DEBUG nova.network.neutron [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Updated VIF entry in instance network info cache for port c7aff639-f0e0-4477-80e1-796f2e00f4fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.361 2 DEBUG nova.network.neutron [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Updating instance_info_cache with network_info: [{"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.378 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-379bce9e-24c6-4d6e-9438-28eed217ca12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.379 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.379 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.380 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.380 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.380 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:29 np0005466013 nova_compute[192144]: 2025-10-02 12:16:29.381 2 DEBUG oslo_concurrency.lockutils [req-b2404b00-7704-4213-b41c-30a0016f08e7 req-6f4d37c0-e82d-4d94-af86-f8714624eb71 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-85bfe864-3153-4ef5-b286-f2f31e93f994" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.323 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.324 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.324 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.324 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.324 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.335 2 INFO nova.compute.manager [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Terminating instance#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.345 2 DEBUG nova.compute.manager [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:16:30 np0005466013 kernel: tap8bb13439-1b (unregistering): left promiscuous mode
Oct  2 08:16:30 np0005466013 NetworkManager[51205]: <info>  [1759407390.3727] device (tap8bb13439-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:30Z|00311|binding|INFO|Releasing lport 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 from this chassis (sb_readonly=0)
Oct  2 08:16:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:30Z|00312|binding|INFO|Setting lport 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 down in Southbound
Oct  2 08:16:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:30Z|00313|binding|INFO|Removing iface tap8bb13439-1b ovn-installed in OVS
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.392 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:72:a3 10.100.0.9'], port_security=['fa:16:3e:99:72:a3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '379bce9e-24c6-4d6e-9438-28eed217ca12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=8bb13439-1bbb-437a-8c8a-d41bd8de3c01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.394 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb13439-1bbb-437a-8c8a-d41bd8de3c01 in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.396 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.397 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2bdbb691-ac4d-4a31-aef9-118e6c92eeb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.398 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace which is not needed anymore#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:30 np0005466013 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000051.scope: Deactivated successfully.
Oct  2 08:16:30 np0005466013 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000051.scope: Consumed 9.692s CPU time.
Oct  2 08:16:30 np0005466013 systemd-machined[152202]: Machine qemu-36-instance-00000051 terminated.
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.469 2 DEBUG nova.compute.manager [req-c8ae4670-8833-4bbb-9d9e-e3a33f7148ec req-5febc650-7b55-44f8-a98e-72182dc1fa57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received event network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.470 2 DEBUG oslo_concurrency.lockutils [req-c8ae4670-8833-4bbb-9d9e-e3a33f7148ec req-5febc650-7b55-44f8-a98e-72182dc1fa57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.470 2 DEBUG oslo_concurrency.lockutils [req-c8ae4670-8833-4bbb-9d9e-e3a33f7148ec req-5febc650-7b55-44f8-a98e-72182dc1fa57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.470 2 DEBUG oslo_concurrency.lockutils [req-c8ae4670-8833-4bbb-9d9e-e3a33f7148ec req-5febc650-7b55-44f8-a98e-72182dc1fa57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.470 2 DEBUG nova.compute.manager [req-c8ae4670-8833-4bbb-9d9e-e3a33f7148ec req-5febc650-7b55-44f8-a98e-72182dc1fa57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] No waiting events found dispatching network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.471 2 WARNING nova.compute.manager [req-c8ae4670-8833-4bbb-9d9e-e3a33f7148ec req-5febc650-7b55-44f8-a98e-72182dc1fa57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received unexpected event network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:30 np0005466013 podman[231861]: 2025-10-02 12:16:30.485805554 +0000 UTC m=+0.088471219 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:16:30 np0005466013 podman[231864]: 2025-10-02 12:16:30.491148254 +0000 UTC m=+0.097350152 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:16:30 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[231640]: [NOTICE]   (231645) : haproxy version is 2.8.14-c23fe91
Oct  2 08:16:30 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[231640]: [NOTICE]   (231645) : path to executable is /usr/sbin/haproxy
Oct  2 08:16:30 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[231640]: [WARNING]  (231645) : Exiting Master process...
Oct  2 08:16:30 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[231640]: [ALERT]    (231645) : Current worker (231647) exited with code 143 (Terminated)
Oct  2 08:16:30 np0005466013 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[231640]: [WARNING]  (231645) : All workers exited. Exiting... (0)
Oct  2 08:16:30 np0005466013 systemd[1]: libpod-c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124.scope: Deactivated successfully.
Oct  2 08:16:30 np0005466013 podman[231925]: 2025-10-02 12:16:30.555641052 +0000 UTC m=+0.051284760 container died c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:16:30 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124-userdata-shm.mount: Deactivated successfully.
Oct  2 08:16:30 np0005466013 systemd[1]: var-lib-containers-storage-overlay-cd75b75e1c0d918f3b94a16cadfb82ac1de3982614444dcabff7c25f51f270c0-merged.mount: Deactivated successfully.
Oct  2 08:16:30 np0005466013 podman[231925]: 2025-10-02 12:16:30.60566276 +0000 UTC m=+0.101306458 container cleanup c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.628 2 INFO nova.virt.libvirt.driver [-] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Instance destroyed successfully.#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.630 2 DEBUG nova.objects.instance [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'resources' on Instance uuid 379bce9e-24c6-4d6e-9438-28eed217ca12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:30 np0005466013 systemd[1]: libpod-conmon-c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124.scope: Deactivated successfully.
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.681 2 DEBUG nova.virt.libvirt.vif [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1069325640',display_name='tempest-ServerDiskConfigTestJSON-server-1069325640',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1069325640',id=81,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-t0ssxxiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:28Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=379bce9e-24c6-4d6e-9438-28eed217ca12,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.682 2 DEBUG nova.network.os_vif_util [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "address": "fa:16:3e:99:72:a3", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb13439-1b", "ovs_interfaceid": "8bb13439-1bbb-437a-8c8a-d41bd8de3c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.682 2 DEBUG nova.network.os_vif_util [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:72:a3,bridge_name='br-int',has_traffic_filtering=True,id=8bb13439-1bbb-437a-8c8a-d41bd8de3c01,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb13439-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.683 2 DEBUG os_vif [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:72:a3,bridge_name='br-int',has_traffic_filtering=True,id=8bb13439-1bbb-437a-8c8a-d41bd8de3c01,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb13439-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.686 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bb13439-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:30 np0005466013 podman[231971]: 2025-10-02 12:16:30.68821206 +0000 UTC m=+0.049521403 container remove c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.692 2 INFO os_vif [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:72:a3,bridge_name='br-int',has_traffic_filtering=True,id=8bb13439-1bbb-437a-8c8a-d41bd8de3c01,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb13439-1b')#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.693 2 INFO nova.virt.libvirt.driver [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Deleting instance files /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12_del#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.694 2 INFO nova.virt.libvirt.driver [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Deletion of /var/lib/nova/instances/379bce9e-24c6-4d6e-9438-28eed217ca12_del complete#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.694 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a40009c5-5f13-42bd-90f9-03ef715a61fd]: (4, ('Thu Oct  2 12:16:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124)\nc80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124\nThu Oct  2 12:16:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (c80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124)\nc80f2463231dcae9ee8b0372a9137c4f84ee1ecb8f921f6dd3fac7230eb96124\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.696 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3a567a-1d2d-4d09-a9d1-91f6f5583f82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.697 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:30 np0005466013 kernel: tapd6de4737-c0: left promiscuous mode
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.713 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[88f0d6cf-3761-47ec-aead-a0b218883191]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.744 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ea7d311e-6a30-49a5-b183-b26c09b0f6cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.746 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e86709a5-5889-44c7-9129-b0e260dbd797]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.763 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5b4dbe17-e258-4f39-97b3-c5fdc6b1bbfd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537231, 'reachable_time': 40740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231986, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 systemd[1]: run-netns-ovnmeta\x2dd6de4737\x2dca60\x2d4c8d\x2dbfd5\x2d687f9366ec8b.mount: Deactivated successfully.
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.769 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:16:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:30.769 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[1f75c18e-d4a3-48de-ae2a-e1e7ed50f4aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.777 2 INFO nova.compute.manager [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.778 2 DEBUG oslo.service.loopingcall [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.778 2 DEBUG nova.compute.manager [-] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.779 2 DEBUG nova.network.neutron [-] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.793 2 DEBUG nova.compute.manager [req-f1f0662a-6ebc-4fc2-8ca8-4d37f048243d req-ea36729f-40d1-45d4-834b-ec014c41a9cb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received event network-vif-unplugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.794 2 DEBUG oslo_concurrency.lockutils [req-f1f0662a-6ebc-4fc2-8ca8-4d37f048243d req-ea36729f-40d1-45d4-834b-ec014c41a9cb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.795 2 DEBUG oslo_concurrency.lockutils [req-f1f0662a-6ebc-4fc2-8ca8-4d37f048243d req-ea36729f-40d1-45d4-834b-ec014c41a9cb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.795 2 DEBUG oslo_concurrency.lockutils [req-f1f0662a-6ebc-4fc2-8ca8-4d37f048243d req-ea36729f-40d1-45d4-834b-ec014c41a9cb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.795 2 DEBUG nova.compute.manager [req-f1f0662a-6ebc-4fc2-8ca8-4d37f048243d req-ea36729f-40d1-45d4-834b-ec014c41a9cb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] No waiting events found dispatching network-vif-unplugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:30 np0005466013 nova_compute[192144]: 2025-10-02 12:16:30.796 2 DEBUG nova.compute.manager [req-f1f0662a-6ebc-4fc2-8ca8-4d37f048243d req-ea36729f-40d1-45d4-834b-ec014c41a9cb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received event network-vif-unplugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.127 2 INFO nova.compute.manager [None req-34aff3b5-9368-4042-b5cd-a08127e7b0b4 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Pausing#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.129 2 DEBUG nova.objects.instance [None req-34aff3b5-9368-4042-b5cd-a08127e7b0b4 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'flavor' on Instance uuid 85bfe864-3153-4ef5-b286-f2f31e93f994 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.170 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407391.1704657, 85bfe864-3153-4ef5-b286-f2f31e93f994 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.171 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.173 2 DEBUG nova.compute.manager [None req-34aff3b5-9368-4042-b5cd-a08127e7b0b4 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.212 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.215 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.258 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.684 2 DEBUG nova.network.neutron [-] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.703 2 INFO nova.compute.manager [-] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Took 0.92 seconds to deallocate network for instance.#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.781 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.781 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.878 2 DEBUG nova.compute.provider_tree [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.894 2 DEBUG nova.scheduler.client.report [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.919 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:31 np0005466013 nova_compute[192144]: 2025-10-02 12:16:31.941 2 INFO nova.scheduler.client.report [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Deleted allocations for instance 379bce9e-24c6-4d6e-9438-28eed217ca12#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.004 2 DEBUG oslo_concurrency.lockutils [None req-4a8c7f70-48a5-4e4c-89a1-a2a0ef201646 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.552 2 DEBUG nova.compute.manager [req-d70d8cfb-c849-4b27-a031-671a8b4d275f req-20be3c35-95ce-4925-87c2-b386f1cbf840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received event network-vif-deleted-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.869 2 DEBUG nova.compute.manager [req-3c527f62-5b1d-4b06-8d0f-89869c59c821 req-60fc17de-629c-45f3-8d56-b280430d05af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received event network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.870 2 DEBUG oslo_concurrency.lockutils [req-3c527f62-5b1d-4b06-8d0f-89869c59c821 req-60fc17de-629c-45f3-8d56-b280430d05af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.870 2 DEBUG oslo_concurrency.lockutils [req-3c527f62-5b1d-4b06-8d0f-89869c59c821 req-60fc17de-629c-45f3-8d56-b280430d05af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.870 2 DEBUG oslo_concurrency.lockutils [req-3c527f62-5b1d-4b06-8d0f-89869c59c821 req-60fc17de-629c-45f3-8d56-b280430d05af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "379bce9e-24c6-4d6e-9438-28eed217ca12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.870 2 DEBUG nova.compute.manager [req-3c527f62-5b1d-4b06-8d0f-89869c59c821 req-60fc17de-629c-45f3-8d56-b280430d05af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] No waiting events found dispatching network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:32 np0005466013 nova_compute[192144]: 2025-10-02 12:16:32.871 2 WARNING nova.compute.manager [req-3c527f62-5b1d-4b06-8d0f-89869c59c821 req-60fc17de-629c-45f3-8d56-b280430d05af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Received unexpected event network-vif-plugged-8bb13439-1bbb-437a-8c8a-d41bd8de3c01 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:16:33 np0005466013 nova_compute[192144]: 2025-10-02 12:16:33.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.279 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.280 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.280 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.280 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.281 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.294 2 INFO nova.compute.manager [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Terminating instance#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.306 2 DEBUG nova.compute.manager [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:16:34 np0005466013 kernel: tapc7aff639-f0 (unregistering): left promiscuous mode
Oct  2 08:16:34 np0005466013 NetworkManager[51205]: <info>  [1759407394.3322] device (tapc7aff639-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:34Z|00314|binding|INFO|Releasing lport c7aff639-f0e0-4477-80e1-796f2e00f4fc from this chassis (sb_readonly=0)
Oct  2 08:16:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:34Z|00315|binding|INFO|Setting lport c7aff639-f0e0-4477-80e1-796f2e00f4fc down in Southbound
Oct  2 08:16:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:34Z|00316|binding|INFO|Removing iface tapc7aff639-f0 ovn-installed in OVS
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.354 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:ad:ee 10.100.0.11'], port_security=['fa:16:3e:14:ad:ee 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '85bfe864-3153-4ef5-b286-f2f31e93f994', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c7aff639-f0e0-4477-80e1-796f2e00f4fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.357 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c7aff639-f0e0-4477-80e1-796f2e00f4fc in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.359 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.360 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb6efaf-a50e-4219-b0ad-d5242e23e150]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.361 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace which is not needed anymore#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.375 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.375 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:34 np0005466013 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000052.scope: Deactivated successfully.
Oct  2 08:16:34 np0005466013 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000052.scope: Consumed 3.355s CPU time.
Oct  2 08:16:34 np0005466013 systemd-machined[152202]: Machine qemu-37-instance-00000052 terminated.
Oct  2 08:16:34 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[231846]: [NOTICE]   (231850) : haproxy version is 2.8.14-c23fe91
Oct  2 08:16:34 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[231846]: [NOTICE]   (231850) : path to executable is /usr/sbin/haproxy
Oct  2 08:16:34 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[231846]: [WARNING]  (231850) : Exiting Master process...
Oct  2 08:16:34 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[231846]: [ALERT]    (231850) : Current worker (231852) exited with code 143 (Terminated)
Oct  2 08:16:34 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[231846]: [WARNING]  (231850) : All workers exited. Exiting... (0)
Oct  2 08:16:34 np0005466013 systemd[1]: libpod-d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312.scope: Deactivated successfully.
Oct  2 08:16:34 np0005466013 podman[232012]: 2025-10-02 12:16:34.504228186 +0000 UTC m=+0.046219989 container died d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312-userdata-shm.mount: Deactivated successfully.
Oct  2 08:16:34 np0005466013 systemd[1]: var-lib-containers-storage-overlay-d96118c6bf071e479c9e84a8237b7824817353b45519483c7bc8e23a68096a2d-merged.mount: Deactivated successfully.
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 podman[232012]: 2025-10-02 12:16:34.544690951 +0000 UTC m=+0.086682764 container cleanup d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:16:34 np0005466013 systemd[1]: libpod-conmon-d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312.scope: Deactivated successfully.
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.582 2 INFO nova.virt.libvirt.driver [-] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Instance destroyed successfully.#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.582 2 DEBUG nova.objects.instance [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'resources' on Instance uuid 85bfe864-3153-4ef5-b286-f2f31e93f994 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.595 2 DEBUG nova.virt.libvirt.vif [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-565092550',display_name='tempest-DeleteServersTestJSON-server-565092550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-565092550',id=82,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-0u75o0es',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:31Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=85bfe864-3153-4ef5-b286-f2f31e93f994,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.595 2 DEBUG nova.network.os_vif_util [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "address": "fa:16:3e:14:ad:ee", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7aff639-f0", "ovs_interfaceid": "c7aff639-f0e0-4477-80e1-796f2e00f4fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.596 2 DEBUG nova.network.os_vif_util [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=c7aff639-f0e0-4477-80e1-796f2e00f4fc,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7aff639-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.596 2 DEBUG os_vif [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=c7aff639-f0e0-4477-80e1-796f2e00f4fc,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7aff639-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc7aff639-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.605 2 INFO os_vif [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=c7aff639-f0e0-4477-80e1-796f2e00f4fc,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7aff639-f0')#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.605 2 INFO nova.virt.libvirt.driver [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Deleting instance files /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994_del#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.606 2 INFO nova.virt.libvirt.driver [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Deletion of /var/lib/nova/instances/85bfe864-3153-4ef5-b286-f2f31e93f994_del complete#033[00m
Oct  2 08:16:34 np0005466013 podman[232049]: 2025-10-02 12:16:34.607021539 +0000 UTC m=+0.042593093 container remove d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.612 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[682adcb1-8319-4e74-9931-15eb02115cef]: (4, ('Thu Oct  2 12:16:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312)\nd1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312\nThu Oct  2 12:16:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (d1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312)\nd1d297f825fbfae07f9601774b3fb948cb7147637302a37fe599571b61ea7312\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.614 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[80e5a963-ad3c-4dab-acf8-d0a47c7a4747]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.615 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 kernel: tapb97b8849-80: left promiscuous mode
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.633 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[812b69a7-a02e-4836-9449-4ef779c40669]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.660 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6805083c-d0fc-4d27-b19c-13988129a9a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.662 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[17273a84-adc4-4be6-95a6-79adb04cbcbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.678 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[58ee92c1-d2cf-479b-8a94-a7e05fc831e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537904, 'reachable_time': 40906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232071, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 systemd[1]: run-netns-ovnmeta\x2db97b8849\x2d844c\x2d4190\x2d8b13\x2dfd7a2d073ce8.mount: Deactivated successfully.
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.682 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:16:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:34.682 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[e53aaa53-9a05-46b6-b896-410e21e12316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.696 2 INFO nova.compute.manager [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.696 2 DEBUG oslo.service.loopingcall [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.697 2 DEBUG nova.compute.manager [-] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:16:34 np0005466013 nova_compute[192144]: 2025-10-02 12:16:34.697 2 DEBUG nova.network.neutron [-] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.142 2 DEBUG nova.compute.manager [req-8abfa43e-40f2-48e2-9685-aabf6ffa897a req-fe29e85b-24c3-4c0d-adc4-debbe759c830 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received event network-vif-unplugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.142 2 DEBUG oslo_concurrency.lockutils [req-8abfa43e-40f2-48e2-9685-aabf6ffa897a req-fe29e85b-24c3-4c0d-adc4-debbe759c830 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.142 2 DEBUG oslo_concurrency.lockutils [req-8abfa43e-40f2-48e2-9685-aabf6ffa897a req-fe29e85b-24c3-4c0d-adc4-debbe759c830 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.143 2 DEBUG oslo_concurrency.lockutils [req-8abfa43e-40f2-48e2-9685-aabf6ffa897a req-fe29e85b-24c3-4c0d-adc4-debbe759c830 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.143 2 DEBUG nova.compute.manager [req-8abfa43e-40f2-48e2-9685-aabf6ffa897a req-fe29e85b-24c3-4c0d-adc4-debbe759c830 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] No waiting events found dispatching network-vif-unplugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.143 2 DEBUG nova.compute.manager [req-8abfa43e-40f2-48e2-9685-aabf6ffa897a req-fe29e85b-24c3-4c0d-adc4-debbe759c830 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received event network-vif-unplugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.779 2 DEBUG nova.network.neutron [-] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.797 2 INFO nova.compute.manager [-] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Took 1.10 seconds to deallocate network for instance.#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.872 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.873 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.930 2 DEBUG nova.compute.provider_tree [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.948 2 DEBUG nova.scheduler.client.report [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:35 np0005466013 nova_compute[192144]: 2025-10-02 12:16:35.974 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:36 np0005466013 nova_compute[192144]: 2025-10-02 12:16:36.006 2 INFO nova.scheduler.client.report [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Deleted allocations for instance 85bfe864-3153-4ef5-b286-f2f31e93f994#033[00m
Oct  2 08:16:36 np0005466013 nova_compute[192144]: 2025-10-02 12:16:36.113 2 DEBUG oslo_concurrency.lockutils [None req-bad34a25-f993-4dcb-8ba7-79a54984a770 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:37 np0005466013 nova_compute[192144]: 2025-10-02 12:16:37.373 2 DEBUG nova.compute.manager [req-309b357e-b686-4c5e-94e8-a42d09a620e0 req-a636bd28-cff5-4212-9451-c08013cb1be1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received event network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:37 np0005466013 nova_compute[192144]: 2025-10-02 12:16:37.373 2 DEBUG oslo_concurrency.lockutils [req-309b357e-b686-4c5e-94e8-a42d09a620e0 req-a636bd28-cff5-4212-9451-c08013cb1be1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:37 np0005466013 nova_compute[192144]: 2025-10-02 12:16:37.374 2 DEBUG oslo_concurrency.lockutils [req-309b357e-b686-4c5e-94e8-a42d09a620e0 req-a636bd28-cff5-4212-9451-c08013cb1be1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:37 np0005466013 nova_compute[192144]: 2025-10-02 12:16:37.374 2 DEBUG oslo_concurrency.lockutils [req-309b357e-b686-4c5e-94e8-a42d09a620e0 req-a636bd28-cff5-4212-9451-c08013cb1be1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "85bfe864-3153-4ef5-b286-f2f31e93f994-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:37 np0005466013 nova_compute[192144]: 2025-10-02 12:16:37.375 2 DEBUG nova.compute.manager [req-309b357e-b686-4c5e-94e8-a42d09a620e0 req-a636bd28-cff5-4212-9451-c08013cb1be1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] No waiting events found dispatching network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:37 np0005466013 nova_compute[192144]: 2025-10-02 12:16:37.375 2 WARNING nova.compute.manager [req-309b357e-b686-4c5e-94e8-a42d09a620e0 req-a636bd28-cff5-4212-9451-c08013cb1be1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received unexpected event network-vif-plugged-c7aff639-f0e0-4477-80e1-796f2e00f4fc for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:16:37 np0005466013 nova_compute[192144]: 2025-10-02 12:16:37.376 2 DEBUG nova.compute.manager [req-309b357e-b686-4c5e-94e8-a42d09a620e0 req-a636bd28-cff5-4212-9451-c08013cb1be1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Received event network-vif-deleted-c7aff639-f0e0-4477-80e1-796f2e00f4fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:38 np0005466013 nova_compute[192144]: 2025-10-02 12:16:38.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:39 np0005466013 nova_compute[192144]: 2025-10-02 12:16:39.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:40.945 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:40 np0005466013 nova_compute[192144]: 2025-10-02 12:16:40.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:40.946 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:16:40 np0005466013 nova_compute[192144]: 2025-10-02 12:16:40.962 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:40 np0005466013 nova_compute[192144]: 2025-10-02 12:16:40.963 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:40 np0005466013 nova_compute[192144]: 2025-10-02 12:16:40.981 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.093 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.093 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.099 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.100 2 INFO nova.compute.claims [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.273 2 DEBUG nova.compute.provider_tree [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.299 2 DEBUG nova.scheduler.client.report [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.328 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.329 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.377 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.378 2 DEBUG nova.network.neutron [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.403 2 INFO nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.416 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.536 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.538 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.538 2 INFO nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Creating image(s)#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.539 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.540 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.541 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.558 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.592 2 DEBUG nova.policy [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0ba8ddde504431b51e593c63f40361', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5db64e6714348c1a7f57bb53de80915', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.655 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.657 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.658 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.673 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.739 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.740 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.785 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
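Note on the two `qemu-img` invocations above: nova first probes the cached base image with `qemu-img info --force-share --output=json`, then creates a copy-on-write qcow2 overlay in the instance directory backed by it. A small sketch of how that `create` argv is composed, with the paths and size copied from the log (building the command only, not executing it — `qemu-img` is not assumed to be installed here):

```python
# Sketch of the CoW overlay creation seen in the log: a qcow2 overlay in the
# instance directory, backed by the cached raw base image. Paths and size
# are copied verbatim from the log entry; we only build the argv.
base = "/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955"
overlay = "/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk"
size_bytes = 1073741824  # 1 GiB root disk from the m1.nano flavor

argv = [
    "env", "LC_ALL=C", "LANG=C",          # stable locale for parseable output
    "qemu-img", "create", "-f", "qcow2",
    "-o", f"backing_file={base},backing_fmt=raw",
    overlay, str(size_bytes),
]
print(" ".join(argv))
```

The resulting command string matches the `Running cmd (subprocess)` line above; `backing_fmt=raw` pins the base-image format so qemu never has to probe it.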
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.786 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.787 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.856 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.858 2 DEBUG nova.virt.disk.api [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Checking if we can resize image /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.858 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.921 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.922 2 DEBUG nova.virt.disk.api [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Cannot resize image /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
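The "Cannot resize image ... to a smaller size." line is not an error: nova's check in `nova/virt/disk/api.py` only allows growing a disk, and it compares the requested size against the image's virtual size as reported by `qemu-img info --output=json`. Here the flavor's 1 GiB root disk equals the overlay's virtual size, so the check declines and the disk is left as-is. A minimal sketch of that decision, assuming a sample JSON payload shaped like real `qemu-img info` output (the `can_resize_image` helper below is an illustrative reimplementation, not nova's actual function):

```python
import json

# Illustrative sample of `qemu-img info --force-share --output=json` output,
# matching the 1 GiB overlay created in the log above.
QEMU_IMG_INFO = json.dumps({
    "virtual-size": 1073741824,  # 1 GiB
    "format": "qcow2",
    "filename": "/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk",
})

def can_resize_image(info_json: str, requested_bytes: int) -> bool:
    """Sketch of nova's rule: a disk may only grow strictly larger."""
    virtual_size = json.loads(info_json)["virtual-size"]
    if virtual_size >= requested_bytes:
        # This branch produces the "Cannot resize image ... to a smaller
        # size." debug line -- it also fires when the sizes are equal.
        return False
    return True

print(can_resize_image(QEMU_IMG_INFO, 1073741824))       # equal -> False
print(can_resize_image(QEMU_IMG_INFO, 2 * 1073741824))   # larger -> True
```

Equal sizes taking the "smaller" branch explains why a freshly created, correctly sized overlay still emits this message on every boot.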
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.923 2 DEBUG nova.objects.instance [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'migration_context' on Instance uuid 03d3f4d9-1589-440a-80c8-3348a75c106b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.954 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.954 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Ensure instance console log exists: /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.955 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.955 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:41 np0005466013 nova_compute[192144]: 2025-10-02 12:16:41.956 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:42 np0005466013 nova_compute[192144]: 2025-10-02 12:16:42.467 2 DEBUG nova.network.neutron [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Successfully created port: 920a7e7c-2b32-4e38-932f-e8ed19c81f7c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.373 2 DEBUG nova.network.neutron [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Successfully updated port: 920a7e7c-2b32-4e38-932f-e8ed19c81f7c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.387 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.388 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.388 2 DEBUG nova.network.neutron [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.589 2 DEBUG nova.compute.manager [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-changed-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.590 2 DEBUG nova.compute.manager [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Refreshing instance network info cache due to event network-changed-920a7e7c-2b32-4e38-932f-e8ed19c81f7c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.590 2 DEBUG oslo_concurrency.lockutils [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:43 np0005466013 nova_compute[192144]: 2025-10-02 12:16:43.649 2 DEBUG nova.network.neutron [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:16:43 np0005466013 podman[232088]: 2025-10-02 12:16:43.689898101 +0000 UTC m=+0.061829984 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:16:43 np0005466013 podman[232087]: 2025-10-02 12:16:43.70971957 +0000 UTC m=+0.081565191 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:16:43 np0005466013 podman[232089]: 2025-10-02 12:16:43.722849447 +0000 UTC m=+0.086713525 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.771 2 DEBUG nova.network.neutron [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Updating instance_info_cache with network_info: [{"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.816 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.816 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance network_info: |[{"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
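The `network_info` structure logged above is plain JSON, so the port's identity is easy to pull out when reading these logs. A small sketch extracting the MAC and fixed IP, with the payload trimmed to the fields used but values copied verbatim from the log entry:

```python
import json

# Trimmed copy of the network_info logged by _allocate_network_async;
# values are taken directly from the log entry above.
network_info = json.loads('''[{
  "id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c",
  "address": "fa:16:3e:46:e6:8d",
  "devname": "tap920a7e7c-2b",
  "type": "ovs",
  "active": false,
  "network": {
    "id": "b97b8849-844c-4190-8b13-fd7a2d073ce8",
    "bridge": "br-int",
    "subnets": [{
      "cidr": "10.100.0.0/28",
      "gateway": {"address": "10.100.0.1", "version": 4},
      "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4}]
    }],
    "meta": {"mtu": 1442, "tunneled": true}
  }
}]''')

vif = network_info[0]
fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
print(vif["address"], fixed_ips)   # fa:16:3e:46:e6:8d ['10.100.0.5']
```

Note `"active": false` at this point: the port exists in Neutron but OVN has not yet reported the VIF plugged; nova waits for the `network-vif-plugged` event before resuming the boot.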
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.817 2 DEBUG oslo_concurrency.lockutils [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.818 2 DEBUG nova.network.neutron [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Refreshing network info cache for port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.821 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Start _get_guest_xml network_info=[{"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.827 2 WARNING nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.832 2 DEBUG nova.virt.libvirt.host [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.833 2 DEBUG nova.virt.libvirt.host [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.836 2 DEBUG nova.virt.libvirt.host [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.837 2 DEBUG nova.virt.libvirt.host [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.838 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.839 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.839 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.840 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.840 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.840 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.841 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.841 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.841 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.842 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.842 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.842 2 DEBUG nova.virt.hardware [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
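[editor's note] The topology lines above show nova.virt.hardware with no flavor or image constraints (limits and preferences all 0:0:0) enumerating exactly one topology for 1 vCPU. A minimal sketch of that enumeration, assuming only what the log states (the product sockets*cores*threads must equal the vCPU count, capped at 65536 per dimension) and not Nova's actual implementation:

```python
# Hypothetical re-creation of the "Build topologies" / "Got 1 possible
# topologies" step logged above; not Nova's real code, just the same idea:
# list every (sockets, cores, threads) triple whose product is the vCPU count.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append((s, c, t))
    return topos

print(possible_topologies(1))  # [(1, 1, 1)]
```

For 1 vCPU the search space collapses to sockets=1, cores=1, threads=1, which is why the log reports a single possible topology and the same triple as the sorted result.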
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.847 2 DEBUG nova.virt.libvirt.vif [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1492208900',display_name='tempest-DeleteServersTestJSON-server-1492208900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1492208900',id=83,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-9alpbnhm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:41Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=03d3f4d9-1589-440a-80c8-3348a75c106b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.848 2 DEBUG nova.network.os_vif_util [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.849 2 DEBUG nova.network.os_vif_util [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:e6:8d,bridge_name='br-int',has_traffic_filtering=True,id=920a7e7c-2b32-4e38-932f-e8ed19c81f7c,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap920a7e7c-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.850 2 DEBUG nova.objects.instance [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 03d3f4d9-1589-440a-80c8-3348a75c106b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.905 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <uuid>03d3f4d9-1589-440a-80c8-3348a75c106b</uuid>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <name>instance-00000053</name>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <nova:name>tempest-DeleteServersTestJSON-server-1492208900</nova:name>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:16:44</nova:creationTime>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:user uuid="0c0ba8ddde504431b51e593c63f40361">tempest-DeleteServersTestJSON-548982240-project-member</nova:user>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:project uuid="d5db64e6714348c1a7f57bb53de80915">tempest-DeleteServersTestJSON-548982240</nova:project>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        <nova:port uuid="920a7e7c-2b32-4e38-932f-e8ed19c81f7c">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <entry name="serial">03d3f4d9-1589-440a-80c8-3348a75c106b</entry>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <entry name="uuid">03d3f4d9-1589-440a-80c8-3348a75c106b</entry>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk.config"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:46:e6:8d"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <target dev="tap920a7e7c-2b"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/console.log" append="off"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:16:44 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:16:44 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:16:44 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:16:44 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
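[editor's note] One detail of the domain XML above that is easy to misread: libvirt's `<memory>` element is expressed in KiB, so the m1.nano flavor's `memory_mb=128` appears as 131072. A small self-contained check against a trimmed copy of the logged XML (the trimming is ours; the values are from the log):

```python
import xml.etree.ElementTree as ET

# Trimmed excerpt of the <domain> document logged by _get_guest_xml above.
DOMAIN_XML = """\
<domain type="kvm">
  <uuid>03d3f4d9-1589-440a-80c8-3348a75c106b</uuid>
  <name>instance-00000053</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
</domain>"""

root = ET.fromstring(DOMAIN_XML)
# libvirt <memory> is in KiB: 128 MiB * 1024 = 131072.
assert int(root.findtext("memory")) == 128 * 1024
assert int(root.findtext("vcpu")) == 1
assert root.get("type") == "kvm"
```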
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.906 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Preparing to wait for external event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.907 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.907 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.907 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.908 2 DEBUG nova.virt.libvirt.vif [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1492208900',display_name='tempest-DeleteServersTestJSON-server-1492208900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1492208900',id=83,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-9alpbnhm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:41Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=03d3f4d9-1589-440a-80c8-3348a75c106b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.909 2 DEBUG nova.network.os_vif_util [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.910 2 DEBUG nova.network.os_vif_util [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:e6:8d,bridge_name='br-int',has_traffic_filtering=True,id=920a7e7c-2b32-4e38-932f-e8ed19c81f7c,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap920a7e7c-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.910 2 DEBUG os_vif [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:e6:8d,bridge_name='br-int',has_traffic_filtering=True,id=920a7e7c-2b32-4e38-932f-e8ed19c81f7c,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap920a7e7c-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.912 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.912 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.916 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap920a7e7c-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.917 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap920a7e7c-2b, col_values=(('external_ids', {'iface-id': '920a7e7c-2b32-4e38-932f-e8ed19c81f7c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:e6:8d', 'vm-uuid': '03d3f4d9-1589-440a-80c8-3348a75c106b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:44 np0005466013 NetworkManager[51205]: <info>  [1759407404.9205] manager: (tap920a7e7c-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:44 np0005466013 nova_compute[192144]: 2025-10-02 12:16:44.926 2 INFO os_vif [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:e6:8d,bridge_name='br-int',has_traffic_filtering=True,id=920a7e7c-2b32-4e38-932f-e8ed19c81f7c,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap920a7e7c-2b')#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.000 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.001 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.002 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No VIF found with MAC fa:16:3e:46:e6:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.003 2 INFO nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Using config drive#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.462 2 INFO nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Creating config drive at /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk.config#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.473 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3syebqho execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.609 2 DEBUG oslo_concurrency.processutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3syebqho" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.627 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407390.6250076, 379bce9e-24c6-4d6e-9438-28eed217ca12 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.627 2 INFO nova.compute.manager [-] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:45 np0005466013 kernel: tap920a7e7c-2b: entered promiscuous mode
Oct  2 08:16:45 np0005466013 NetworkManager[51205]: <info>  [1759407405.6852] manager: (tap920a7e7c-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:45Z|00317|binding|INFO|Claiming lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c for this chassis.
Oct  2 08:16:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:45Z|00318|binding|INFO|920a7e7c-2b32-4e38-932f-e8ed19c81f7c: Claiming fa:16:3e:46:e6:8d 10.100.0.5
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.708 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:e6:8d 10.100.0.5'], port_security=['fa:16:3e:46:e6:8d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '03d3f4d9-1589-440a-80c8-3348a75c106b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=920a7e7c-2b32-4e38-932f-e8ed19c81f7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.709 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 bound to our chassis#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.711 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97b8849-844c-4190-8b13-fd7a2d073ce8#033[00m
Oct  2 08:16:45 np0005466013 systemd-udevd[232173]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.723 2 DEBUG nova.compute.manager [None req-3025f413-270e-4902-8731-e0902eb9cb32 - - - - - -] [instance: 379bce9e-24c6-4d6e-9438-28eed217ca12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.726 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3f9b55d6-e486-4e6d-87f9-19d18aab1b38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.727 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97b8849-81 in ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.728 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97b8849-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.729 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d2057bac-d32f-40c8-9aff-4c12cb6f33dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.729 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a51d1e56-c854-4ce2-9394-ebe889113e8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 NetworkManager[51205]: <info>  [1759407405.7330] device (tap920a7e7c-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:45 np0005466013 NetworkManager[51205]: <info>  [1759407405.7342] device (tap920a7e7c-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:45 np0005466013 systemd-machined[152202]: New machine qemu-38-instance-00000053.
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.744 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[c127a3c2-ffa9-436d-805a-e566dd86d84e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:45 np0005466013 systemd[1]: Started Virtual Machine qemu-38-instance-00000053.
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:45Z|00319|binding|INFO|Setting lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c ovn-installed in OVS
Oct  2 08:16:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:45Z|00320|binding|INFO|Setting lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c up in Southbound
Oct  2 08:16:45 np0005466013 nova_compute[192144]: 2025-10-02 12:16:45.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.770 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b102f05e-2230-4836-9c72-d318e7001b0b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.801 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3723d7-b8bc-48a0-b918-6dcc3adbad6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 NetworkManager[51205]: <info>  [1759407405.8086] manager: (tapb97b8849-80): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.808 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[05ce151d-9194-41f7-81b9-3052b11b29d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.843 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[aefd759d-81b4-499d-aca1-0943e5914702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.846 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[05f93499-afa0-4e8f-aa45-7e3cc08f5dd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 NetworkManager[51205]: <info>  [1759407405.8687] device (tapb97b8849-80): carrier: link connected
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.873 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[93dcc262-f7a8-438d-bc35-61f513dc4f4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.890 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[39ee1e52-67e0-4ec9-8dd6-dc6c7394f6da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539748, 'reachable_time': 23428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232207, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.906 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d3bd9e78-394e-4b0a-9339-46f44a087d3b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:e0b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539748, 'tstamp': 539748}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232208, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.923 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1e47cb-09c6-4af3-9563-e5b36504da03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539748, 'reachable_time': 23428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232209, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:45.958 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[22d8e865-1268-4071-ac96-ac243a726c33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.032 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7d6ec104-6328-4865-9a77-a7f5c612313c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.034 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.035 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.035 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97b8849-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:46 np0005466013 NetworkManager[51205]: <info>  [1759407406.0385] manager: (tapb97b8849-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Oct  2 08:16:46 np0005466013 kernel: tapb97b8849-80: entered promiscuous mode
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.040 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97b8849-80, col_values=(('external_ids', {'iface-id': '055cf080-4472-4807-a697-69de84e96953'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:46Z|00321|binding|INFO|Releasing lport 055cf080-4472-4807-a697-69de84e96953 from this chassis (sb_readonly=0)
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.059 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.060 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c26b51e4-79e7-465d-a4fc-01a7a5e4f717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.062 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:46.063 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'env', 'PROCESS_TAG=haproxy-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97b8849-844c-4190-8b13-fd7a2d073ce8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:46 np0005466013 podman[232248]: 2025-10-02 12:16:46.464468934 +0000 UTC m=+0.046699394 container create f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.481 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407406.4807177, 03d3f4d9-1589-440a-80c8-3348a75c106b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.482 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:46 np0005466013 systemd[1]: Started libpod-conmon-f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7.scope.
Oct  2 08:16:46 np0005466013 podman[232248]: 2025-10-02 12:16:46.439735159 +0000 UTC m=+0.021965639 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:46 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:16:46 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7f382718cbad6eeacf8fcd1ef6d52e516fc10f94c2d7a9837903ebcb55fe9fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:46 np0005466013 podman[232248]: 2025-10-02 12:16:46.564744678 +0000 UTC m=+0.146975138 container init f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:16:46 np0005466013 podman[232248]: 2025-10-02 12:16:46.569819858 +0000 UTC m=+0.152050318 container start f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.571 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.577 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407406.4817004, 03d3f4d9-1589-440a-80c8-3348a75c106b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.578 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:46 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[232264]: [NOTICE]   (232268) : New worker (232270) forked
Oct  2 08:16:46 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[232264]: [NOTICE]   (232268) : Loading success.
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.639 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.644 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.654 2 DEBUG nova.compute.manager [req-a17ef32d-8f90-46fc-9484-8bd407b9404e req-177a7655-0e91-4755-9df8-14e2d62dbed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.655 2 DEBUG oslo_concurrency.lockutils [req-a17ef32d-8f90-46fc-9484-8bd407b9404e req-177a7655-0e91-4755-9df8-14e2d62dbed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.655 2 DEBUG oslo_concurrency.lockutils [req-a17ef32d-8f90-46fc-9484-8bd407b9404e req-177a7655-0e91-4755-9df8-14e2d62dbed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.656 2 DEBUG oslo_concurrency.lockutils [req-a17ef32d-8f90-46fc-9484-8bd407b9404e req-177a7655-0e91-4755-9df8-14e2d62dbed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.656 2 DEBUG nova.compute.manager [req-a17ef32d-8f90-46fc-9484-8bd407b9404e req-177a7655-0e91-4755-9df8-14e2d62dbed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Processing event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.657 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.662 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.666 2 INFO nova.virt.libvirt.driver [-] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance spawned successfully.#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.666 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.693 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.694 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407406.6619902, 03d3f4d9-1589-440a-80c8-3348a75c106b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.694 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.719 2 DEBUG nova.network.neutron [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Updated VIF entry in instance network info cache for port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.720 2 DEBUG nova.network.neutron [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Updating instance_info_cache with network_info: [{"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.760 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.767 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.770 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:46 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.771 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:46 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.771 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.772 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.772 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.773 2 DEBUG nova.virt.libvirt.driver [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.816 2 DEBUG oslo_concurrency.lockutils [req-f668423e-e40a-4e54-818e-99fba8f244a3 req-f357b683-17a1-47ba-9cd7-54694df60235 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:46 np0005466013 nova_compute[192144]: 2025-10-02 12:16:46.859 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:47 np0005466013 nova_compute[192144]: 2025-10-02 12:16:47.214 2 INFO nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Took 5.68 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:47 np0005466013 nova_compute[192144]: 2025-10-02 12:16:47.215 2 DEBUG nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:47 np0005466013 nova_compute[192144]: 2025-10-02 12:16:47.559 2 INFO nova.compute.manager [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Took 6.50 seconds to build instance.#033[00m
Oct  2 08:16:47 np0005466013 nova_compute[192144]: 2025-10-02 12:16:47.580 2 DEBUG oslo_concurrency.lockutils [None req-6be0e4a4-3e95-4bc1-8aed-c9c94cc81907 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:48 np0005466013 nova_compute[192144]: 2025-10-02 12:16:48.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:16:48.949 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:48 np0005466013 nova_compute[192144]: 2025-10-02 12:16:48.996 2 DEBUG nova.compute.manager [req-86dc4bb8-d350-4e5c-8562-538116f68d66 req-766cfcfd-7885-4daf-95bd-dd7535d5bce4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:48 np0005466013 nova_compute[192144]: 2025-10-02 12:16:48.997 2 DEBUG oslo_concurrency.lockutils [req-86dc4bb8-d350-4e5c-8562-538116f68d66 req-766cfcfd-7885-4daf-95bd-dd7535d5bce4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:48 np0005466013 nova_compute[192144]: 2025-10-02 12:16:48.997 2 DEBUG oslo_concurrency.lockutils [req-86dc4bb8-d350-4e5c-8562-538116f68d66 req-766cfcfd-7885-4daf-95bd-dd7535d5bce4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:48 np0005466013 nova_compute[192144]: 2025-10-02 12:16:48.998 2 DEBUG oslo_concurrency.lockutils [req-86dc4bb8-d350-4e5c-8562-538116f68d66 req-766cfcfd-7885-4daf-95bd-dd7535d5bce4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:48 np0005466013 nova_compute[192144]: 2025-10-02 12:16:48.998 2 DEBUG nova.compute.manager [req-86dc4bb8-d350-4e5c-8562-538116f68d66 req-766cfcfd-7885-4daf-95bd-dd7535d5bce4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] No waiting events found dispatching network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:48 np0005466013 nova_compute[192144]: 2025-10-02 12:16:48.998 2 WARNING nova.compute.manager [req-86dc4bb8-d350-4e5c-8562-538116f68d66 req-766cfcfd-7885-4daf-95bd-dd7535d5bce4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received unexpected event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.106 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.107 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.107 2 INFO nova.compute.manager [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Shelving#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.151 2 DEBUG nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.580 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407394.5791566, 85bfe864-3153-4ef5-b286-f2f31e93f994 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.581 2 INFO nova.compute.manager [-] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.637 2 DEBUG nova.compute.manager [None req-8b88b883-2360-4f54-9565-5ec6e281faea - - - - - -] [instance: 85bfe864-3153-4ef5-b286-f2f31e93f994] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:49 np0005466013 nova_compute[192144]: 2025-10-02 12:16:49.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:52 np0005466013 podman[232280]: 2025-10-02 12:16:52.705425233 +0000 UTC m=+0.073120762 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 08:16:53 np0005466013 nova_compute[192144]: 2025-10-02 12:16:53.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.768 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.769 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.799 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:54 np0005466013 podman[232301]: 2025-10-02 12:16:54.878988466 +0000 UTC m=+0.070740076 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:16:54 np0005466013 podman[232302]: 2025-10-02 12:16:54.878995717 +0000 UTC m=+0.070647654 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=edpm)
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.906 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.906 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.913 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.914 2 INFO nova.compute.claims [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:16:54 np0005466013 nova_compute[192144]: 2025-10-02 12:16:54.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.055 2 DEBUG nova.compute.provider_tree [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.075 2 DEBUG nova.scheduler.client.report [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.190 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.191 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.268 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.269 2 DEBUG nova.network.neutron [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.305 2 INFO nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.331 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.467 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.468 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.469 2 INFO nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Creating image(s)#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.470 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "/var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.470 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "/var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.471 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "/var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.487 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.546 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.547 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.548 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.559 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.614 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.616 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.650 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.651 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.652 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.708 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.709 2 DEBUG nova.virt.disk.api [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Checking if we can resize image /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.710 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.767 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.769 2 DEBUG nova.virt.disk.api [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Cannot resize image /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.770 2 DEBUG nova.objects.instance [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lazy-loading 'migration_context' on Instance uuid f75b8563-463c-4024-97d0-befe60db5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.822 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.822 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Ensure instance console log exists: /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.823 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.824 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.824 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:55 np0005466013 nova_compute[192144]: 2025-10-02 12:16:55.907 2 DEBUG nova.policy [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:58Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:46:e6:8d 10.100.0.5
Oct  2 08:16:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:16:58Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:46:e6:8d 10.100.0.5
Oct  2 08:16:58 np0005466013 nova_compute[192144]: 2025-10-02 12:16:58.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:58 np0005466013 nova_compute[192144]: 2025-10-02 12:16:58.635 2 DEBUG nova.network.neutron [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Successfully created port: d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:59 np0005466013 nova_compute[192144]: 2025-10-02 12:16:59.206 2 DEBUG nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:16:59 np0005466013 nova_compute[192144]: 2025-10-02 12:16:59.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:00 np0005466013 podman[232376]: 2025-10-02 12:17:00.687120815 +0000 UTC m=+0.051581028 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:17:00 np0005466013 podman[232377]: 2025-10-02 12:17:00.694121257 +0000 UTC m=+0.058479807 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:17:01 np0005466013 kernel: tap920a7e7c-2b (unregistering): left promiscuous mode
Oct  2 08:17:01 np0005466013 NetworkManager[51205]: <info>  [1759407421.5647] device (tap920a7e7c-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00322|binding|INFO|Releasing lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c from this chassis (sb_readonly=0)
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00323|binding|INFO|Setting lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c down in Southbound
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00324|binding|INFO|Removing iface tap920a7e7c-2b ovn-installed in OVS
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.602 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:e6:8d 10.100.0.5'], port_security=['fa:16:3e:46:e6:8d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '03d3f4d9-1589-440a-80c8-3348a75c106b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=920a7e7c-2b32-4e38-932f-e8ed19c81f7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.604 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.606 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.607 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[96a19064-be9d-407b-87c0-653cd1c3da58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.608 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace which is not needed anymore#033[00m
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000053.scope: Deactivated successfully.
Oct  2 08:17:01 np0005466013 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000053.scope: Consumed 13.046s CPU time.
Oct  2 08:17:01 np0005466013 systemd-machined[152202]: Machine qemu-38-instance-00000053 terminated.
Oct  2 08:17:01 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[232264]: [NOTICE]   (232268) : haproxy version is 2.8.14-c23fe91
Oct  2 08:17:01 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[232264]: [NOTICE]   (232268) : path to executable is /usr/sbin/haproxy
Oct  2 08:17:01 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[232264]: [WARNING]  (232268) : Exiting Master process...
Oct  2 08:17:01 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[232264]: [ALERT]    (232268) : Current worker (232270) exited with code 143 (Terminated)
Oct  2 08:17:01 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[232264]: [WARNING]  (232268) : All workers exited. Exiting... (0)
Oct  2 08:17:01 np0005466013 systemd[1]: libpod-f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7.scope: Deactivated successfully.
Oct  2 08:17:01 np0005466013 podman[232443]: 2025-10-02 12:17:01.755545954 +0000 UTC m=+0.055494973 container died f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:17:01 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7-userdata-shm.mount: Deactivated successfully.
Oct  2 08:17:01 np0005466013 systemd[1]: var-lib-containers-storage-overlay-d7f382718cbad6eeacf8fcd1ef6d52e516fc10f94c2d7a9837903ebcb55fe9fa-merged.mount: Deactivated successfully.
Oct  2 08:17:01 np0005466013 podman[232443]: 2025-10-02 12:17:01.810302802 +0000 UTC m=+0.110251821 container cleanup f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:17:01 np0005466013 systemd[1]: libpod-conmon-f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7.scope: Deactivated successfully.
Oct  2 08:17:01 np0005466013 kernel: tap920a7e7c-2b: entered promiscuous mode
Oct  2 08:17:01 np0005466013 kernel: tap920a7e7c-2b (unregistering): left promiscuous mode
Oct  2 08:17:01 np0005466013 NetworkManager[51205]: <info>  [1759407421.8347] manager: (tap920a7e7c-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00325|binding|INFO|Claiming lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c for this chassis.
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00326|binding|INFO|920a7e7c-2b32-4e38-932f-e8ed19c81f7c: Claiming fa:16:3e:46:e6:8d 10.100.0.5
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.875 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:e6:8d 10.100.0.5'], port_security=['fa:16:3e:46:e6:8d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '03d3f4d9-1589-440a-80c8-3348a75c106b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=920a7e7c-2b32-4e38-932f-e8ed19c81f7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00327|binding|INFO|Setting lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c ovn-installed in OVS
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00328|binding|INFO|Setting lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c up in Southbound
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 podman[232472]: 2025-10-02 12:17:01.914930643 +0000 UTC m=+0.048969645 container remove f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.920 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[596acca0-6cb7-4e8c-ba11-fb5c4060fae5]: (4, ('Thu Oct  2 12:17:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7)\nf49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7\nThu Oct  2 12:17:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (f49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7)\nf49066be5c3774279752b80eb4ca76e391b22f4bb7023f6831e33db0e0faadc7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.922 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f55444-4fa0-45b8-9bf3-7cea11ecf7c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.923 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 kernel: tapb97b8849-80: left promiscuous mode
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00329|binding|INFO|Releasing lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c from this chassis (sb_readonly=0)
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00330|binding|INFO|Setting lport 920a7e7c-2b32-4e38-932f-e8ed19c81f7c down in Southbound
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:01Z|00331|binding|INFO|Removing iface tap920a7e7c-2b ovn-installed in OVS
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.945 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d8969997-8bea-4d07-9b8d-08152c137068]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466013 nova_compute[192144]: 2025-10-02 12:17:01.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.958 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:e6:8d 10.100.0.5'], port_security=['fa:16:3e:46:e6:8d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '03d3f4d9-1589-440a-80c8-3348a75c106b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=920a7e7c-2b32-4e38-932f-e8ed19c81f7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.984 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[90cb61ac-d7d6-468a-8b31-032ccd4b4ba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:01.985 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7b93ef17-4830-4ff2-8a27-4f296fbdfc9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.001 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[89eb32ee-8ac1-4681-b7b3-8ec92927dc8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539741, 'reachable_time': 31065, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232500, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.005 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.005 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[482d3ffe-a39e-48d3-bd58-52e15348a1f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.006 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:17:02 np0005466013 systemd[1]: run-netns-ovnmeta\x2db97b8849\x2d844c\x2d4190\x2d8b13\x2dfd7a2d073ce8.mount: Deactivated successfully.
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.008 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.008 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0b609b-f570-46c8-9f14-755f327b6f2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.009 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.010 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.011 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f365e3f4-69c7-4d69-84c1-7d8af6c5c81b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.052 2 DEBUG nova.network.neutron [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Successfully updated port: d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.082 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.083 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquired lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.083 2 DEBUG nova.network.neutron [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.222 2 INFO nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.228 2 INFO nova.virt.libvirt.driver [-] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance destroyed successfully.#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.228 2 DEBUG nova.objects.instance [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'numa_topology' on Instance uuid 03d3f4d9-1589-440a-80c8-3348a75c106b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.298 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.299 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:02.299 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.533 2 DEBUG nova.compute.manager [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received event network-changed-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.534 2 DEBUG nova.compute.manager [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Refreshing instance network info cache due to event network-changed-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.535 2 DEBUG oslo_concurrency.lockutils [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.878 2 DEBUG nova.network.neutron [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:17:02 np0005466013 nova_compute[192144]: 2025-10-02 12:17:02.993 2 INFO nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Beginning cold snapshot process#033[00m
Oct  2 08:17:03 np0005466013 nova_compute[192144]: 2025-10-02 12:17:03.281 2 DEBUG nova.privsep.utils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:17:03 np0005466013 nova_compute[192144]: 2025-10-02 12:17:03.282 2 DEBUG oslo_concurrency.processutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk /var/lib/nova/instances/snapshots/tmppsfub8hx/107fde94b64944a9a3101ee21d530974 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:03 np0005466013 nova_compute[192144]: 2025-10-02 12:17:03.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:03 np0005466013 nova_compute[192144]: 2025-10-02 12:17:03.680 2 DEBUG oslo_concurrency.processutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b/disk /var/lib/nova/instances/snapshots/tmppsfub8hx/107fde94b64944a9a3101ee21d530974" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:03 np0005466013 nova_compute[192144]: 2025-10-02 12:17:03.681 2 INFO nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.358 2 DEBUG nova.network.neutron [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Updating instance_info_cache with network_info: [{"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.397 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Releasing lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.398 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Instance network_info: |[{"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.398 2 DEBUG oslo_concurrency.lockutils [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.399 2 DEBUG nova.network.neutron [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Refreshing network info cache for port d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.404 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Start _get_guest_xml network_info=[{"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.409 2 WARNING nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.415 2 DEBUG nova.virt.libvirt.host [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.416 2 DEBUG nova.virt.libvirt.host [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.419 2 DEBUG nova.virt.libvirt.host [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.420 2 DEBUG nova.virt.libvirt.host [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.422 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.423 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:25Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9949d9da-6314-4ede-8797-6f2f0a6a64fc',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.423 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.424 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.424 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.424 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.424 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.425 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.425 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.425 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.425 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.426 2 DEBUG nova.virt.hardware [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.429 2 DEBUG nova.virt.libvirt.vif [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1580551683',display_name='tempest-ListServerFiltersTestJSON-instance-1580551683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1580551683',id=87,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e0277f0bb0f4a349e2e6d8ddfa24edf',ramdisk_id='',reservation_id='r-7yh3ztad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-298715262',owner_user_name='tempest-ListServerFiltersTestJSON-298715262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:55Z,user_data=None,user_id='001d2d51902d4e299b775131f430a5db',uuid=f75b8563-463c-4024-97d0-befe60db5872,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.430 2 DEBUG nova.network.os_vif_util [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converting VIF {"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.430 2 DEBUG nova.network.os_vif_util [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:5e:f0,bridge_name='br-int',has_traffic_filtering=True,id=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86401f6-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.431 2 DEBUG nova.objects.instance [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lazy-loading 'pci_devices' on Instance uuid f75b8563-463c-4024-97d0-befe60db5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.458 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <uuid>f75b8563-463c-4024-97d0-befe60db5872</uuid>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <name>instance-00000057</name>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <memory>196608</memory>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1580551683</nova:name>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:17:04</nova:creationTime>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.micro">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:memory>192</nova:memory>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:user uuid="001d2d51902d4e299b775131f430a5db">tempest-ListServerFiltersTestJSON-298715262-project-member</nova:user>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:project uuid="6e0277f0bb0f4a349e2e6d8ddfa24edf">tempest-ListServerFiltersTestJSON-298715262</nova:project>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        <nova:port uuid="d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <entry name="serial">f75b8563-463c-4024-97d0-befe60db5872</entry>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <entry name="uuid">f75b8563-463c-4024-97d0-befe60db5872</entry>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk.config"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:98:5e:f0"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <target dev="tapd86401f6-e4"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/console.log" append="off"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:17:04 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:17:04 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:17:04 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:17:04 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.460 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Preparing to wait for external event network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.460 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.460 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.460 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.461 2 DEBUG nova.virt.libvirt.vif [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1580551683',display_name='tempest-ListServerFiltersTestJSON-instance-1580551683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1580551683',id=87,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e0277f0bb0f4a349e2e6d8ddfa24edf',ramdisk_id='',reservation_id='r-7yh3ztad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-298715262',owner_user_name='tempest-ListServerFiltersTestJSON-298715262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:55Z,user_data=None,user_id='001d2d51902d4e299b775131f430a5db',uuid=f75b8563-463c-4024-97d0-befe60db5872,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.461 2 DEBUG nova.network.os_vif_util [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converting VIF {"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.462 2 DEBUG nova.network.os_vif_util [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:5e:f0,bridge_name='br-int',has_traffic_filtering=True,id=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86401f6-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.462 2 DEBUG os_vif [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:5e:f0,bridge_name='br-int',has_traffic_filtering=True,id=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86401f6-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.467 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd86401f6-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.467 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd86401f6-e4, col_values=(('external_ids', {'iface-id': 'd86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:5e:f0', 'vm-uuid': 'f75b8563-463c-4024-97d0-befe60db5872'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005466013 NetworkManager[51205]: <info>  [1759407424.4700] manager: (tapd86401f6-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.478 2 INFO os_vif [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:5e:f0,bridge_name='br-int',has_traffic_filtering=True,id=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86401f6-e4')#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.549 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.551 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.552 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] No VIF found with MAC fa:16:3e:98:5e:f0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.552 2 INFO nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Using config drive#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.691 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-unplugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.691 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.692 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.692 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.692 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] No waiting events found dispatching network-vif-unplugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.692 2 WARNING nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received unexpected event network-vif-unplugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.693 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.693 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.693 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.693 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.694 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] No waiting events found dispatching network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.694 2 WARNING nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received unexpected event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.694 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.694 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.694 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.695 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.695 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] No waiting events found dispatching network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.695 2 WARNING nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received unexpected event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.695 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.696 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.696 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.696 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.696 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] No waiting events found dispatching network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.697 2 WARNING nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received unexpected event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.697 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-unplugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.697 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.697 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.697 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.698 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] No waiting events found dispatching network-vif-unplugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.698 2 WARNING nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received unexpected event network-vif-unplugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.698 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.698 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.698 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.699 2 DEBUG oslo_concurrency.lockutils [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.699 2 DEBUG nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] No waiting events found dispatching network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:04 np0005466013 nova_compute[192144]: 2025-10-02 12:17:04.699 2 WARNING nova.compute.manager [req-cf41a184-4868-453e-b08a-467bb2416719 req-21a58c2b-6505-4a41-bfb0-7895333f112c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received unexpected event network-vif-plugged-920a7e7c-2b32-4e38-932f-e8ed19c81f7c for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:17:05 np0005466013 nova_compute[192144]: 2025-10-02 12:17:05.208 2 INFO nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Creating config drive at /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk.config#033[00m
Oct  2 08:17:05 np0005466013 nova_compute[192144]: 2025-10-02 12:17:05.215 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_h7zjm6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:05 np0005466013 nova_compute[192144]: 2025-10-02 12:17:05.348 2 DEBUG oslo_concurrency.processutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_h7zjm6" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:05 np0005466013 kernel: tapd86401f6-e4: entered promiscuous mode
Oct  2 08:17:05 np0005466013 NetworkManager[51205]: <info>  [1759407425.4294] manager: (tapd86401f6-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Oct  2 08:17:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:05Z|00332|binding|INFO|Claiming lport d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 for this chassis.
Oct  2 08:17:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:05Z|00333|binding|INFO|d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2: Claiming fa:16:3e:98:5e:f0 10.100.0.13
Oct  2 08:17:05 np0005466013 nova_compute[192144]: 2025-10-02 12:17:05.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005466013 systemd-udevd[232531]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:17:05 np0005466013 NetworkManager[51205]: <info>  [1759407425.4756] device (tapd86401f6-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:17:05 np0005466013 NetworkManager[51205]: <info>  [1759407425.4779] device (tapd86401f6-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:17:05 np0005466013 nova_compute[192144]: 2025-10-02 12:17:05.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:05Z|00334|binding|INFO|Setting lport d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 ovn-installed in OVS
Oct  2 08:17:05 np0005466013 nova_compute[192144]: 2025-10-02 12:17:05.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005466013 systemd-machined[152202]: New machine qemu-39-instance-00000057.
Oct  2 08:17:05 np0005466013 systemd[1]: Started Virtual Machine qemu-39-instance-00000057.
Oct  2 08:17:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:05Z|00335|binding|INFO|Setting lport d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 up in Southbound
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.865 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:5e:f0 10.100.0.13'], port_security=['fa:16:3e:98:5e:f0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4e0bc42-3cfd-4f42-a319-553606576b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a043239b-039e-45fa-8277-43e361a8bae7, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.866 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 in datapath bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 bound to our chassis#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.867 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.880 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0a74acef-2c93-43c4-8d0d-002a7a4a29eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.882 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd543a6a-b1 in ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.884 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd543a6a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.885 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[824439d1-958c-479d-a557-76df6cfa9c8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.885 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8c97f355-d125-428f-8453-d307db5cd922]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.901 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[c18d695e-3a27-4a6e-bd2b-b7a3e66ed970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.920 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[190ea696-8cde-4a4d-ad69-00ef8adffacb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.955 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a882c44c-eae5-4a54-82ba-0bbf9bb94dee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005466013 NetworkManager[51205]: <info>  [1759407425.9682] manager: (tapbd543a6a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Oct  2 08:17:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:05.968 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5531ec83-20c5-4029-ba2e-19f1f37f1338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:05 np0005466013 systemd-udevd[232537]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.016 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b1ac9c-3153-4bd0-bd94-fde00d340c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.019 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9578bf74-250f-4b16-91ae-6ca34beedbd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 NetworkManager[51205]: <info>  [1759407426.0438] device (tapbd543a6a-b0): carrier: link connected
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.051 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f6a4eb-cd04-4bbb-a289-a9657ba9cd71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.072 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[22e3b34a-e332-445b-aedf-8ffd0b905cf3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd543a6a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:7a:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541766, 'reachable_time': 29720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232581, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.092 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5b037e91-e7f1-432c-b22f-ca74328223d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe71:7a4a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 541766, 'tstamp': 541766}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232582, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.112 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[03015d0b-e9de-4d17-be31-08a80bd3be2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd543a6a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:7a:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541766, 'reachable_time': 29720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232584, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.149 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[21208004-b5f6-43b3-8359-5f0befb20070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.219 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3dcbdd-e1e2-414f-a52f-ea6ed49b6ae5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.221 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd543a6a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.221 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.222 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd543a6a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:06 np0005466013 kernel: tapbd543a6a-b0: entered promiscuous mode
Oct  2 08:17:06 np0005466013 nova_compute[192144]: 2025-10-02 12:17:06.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:06 np0005466013 NetworkManager[51205]: <info>  [1759407426.2255] manager: (tapbd543a6a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Oct  2 08:17:06 np0005466013 nova_compute[192144]: 2025-10-02 12:17:06.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.232 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd543a6a-b0, col_values=(('external_ids', {'iface-id': '1bd1cb43-f90b-4e8c-92cc-e89ec36a0b0f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:06 np0005466013 nova_compute[192144]: 2025-10-02 12:17:06.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:06Z|00336|binding|INFO|Releasing lport 1bd1cb43-f90b-4e8c-92cc-e89ec36a0b0f from this chassis (sb_readonly=0)
Oct  2 08:17:06 np0005466013 nova_compute[192144]: 2025-10-02 12:17:06.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.250 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.251 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[727abda9-01ff-41f2-aff8-2c6a6f262999]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.252 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.pid.haproxy
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:17:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:06.252 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'env', 'PROCESS_TAG=haproxy-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:17:06 np0005466013 nova_compute[192144]: 2025-10-02 12:17:06.539 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407426.5380354, f75b8563-463c-4024-97d0-befe60db5872 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:06 np0005466013 nova_compute[192144]: 2025-10-02 12:17:06.539 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] VM Started (Lifecycle Event)#033[00m
Oct  2 08:17:06 np0005466013 podman[232619]: 2025-10-02 12:17:06.635480193 +0000 UTC m=+0.049472868 container create e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:17:06 np0005466013 systemd[1]: Started libpod-conmon-e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa.scope.
Oct  2 08:17:06 np0005466013 podman[232619]: 2025-10-02 12:17:06.611755231 +0000 UTC m=+0.025747936 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:17:06 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:17:06 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/104e279e1377b89d3cda61514788ca782e1f004291b6abbaaf351376e19fa7ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:17:06 np0005466013 podman[232619]: 2025-10-02 12:17:06.727301586 +0000 UTC m=+0.141294291 container init e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:17:06 np0005466013 podman[232619]: 2025-10-02 12:17:06.733449845 +0000 UTC m=+0.147442520 container start e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:17:06 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [NOTICE]   (232639) : New worker (232641) forked
Oct  2 08:17:06 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [NOTICE]   (232639) : Loading success.
Oct  2 08:17:07 np0005466013 nova_compute[192144]: 2025-10-02 12:17:07.587 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:07 np0005466013 nova_compute[192144]: 2025-10-02 12:17:07.593 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407426.539777, f75b8563-463c-4024-97d0-befe60db5872 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:07 np0005466013 nova_compute[192144]: 2025-10-02 12:17:07.593 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:17:07 np0005466013 nova_compute[192144]: 2025-10-02 12:17:07.684 2 DEBUG nova.network.neutron [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Updated VIF entry in instance network info cache for port d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:07 np0005466013 nova_compute[192144]: 2025-10-02 12:17:07.685 2 DEBUG nova.network.neutron [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Updating instance_info_cache with network_info: [{"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:07 np0005466013 nova_compute[192144]: 2025-10-02 12:17:07.981 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:07 np0005466013 nova_compute[192144]: 2025-10-02 12:17:07.987 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.160 2 DEBUG nova.compute.manager [req-3c3e1a14-656f-48dc-8fcc-a9eb58b0d5b6 req-eceb6e72-8587-460d-90f9-220633948748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received event network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.161 2 DEBUG oslo_concurrency.lockutils [req-3c3e1a14-656f-48dc-8fcc-a9eb58b0d5b6 req-eceb6e72-8587-460d-90f9-220633948748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.161 2 DEBUG oslo_concurrency.lockutils [req-3c3e1a14-656f-48dc-8fcc-a9eb58b0d5b6 req-eceb6e72-8587-460d-90f9-220633948748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.162 2 DEBUG oslo_concurrency.lockutils [req-3c3e1a14-656f-48dc-8fcc-a9eb58b0d5b6 req-eceb6e72-8587-460d-90f9-220633948748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.162 2 DEBUG nova.compute.manager [req-3c3e1a14-656f-48dc-8fcc-a9eb58b0d5b6 req-eceb6e72-8587-460d-90f9-220633948748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Processing event network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.163 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.170 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.175 2 INFO nova.virt.libvirt.driver [-] [instance: f75b8563-463c-4024-97d0-befe60db5872] Instance spawned successfully.#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.176 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.183 2 DEBUG oslo_concurrency.lockutils [req-da136531-ad89-45bb-a489-bab91a193446 req-6f0a287e-5081-4bc8-b11c-a3cc2a191c3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.196 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.197 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407428.1692398, f75b8563-463c-4024-97d0-befe60db5872 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.197 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.211 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.211 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.212 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.213 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.214 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.214 2 DEBUG nova.virt.libvirt.driver [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.248 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.252 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.284 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.308 2 INFO nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Took 12.84 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.309 2 DEBUG nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.609 2 INFO nova.compute.manager [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Took 13.74 seconds to build instance.#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.641 2 DEBUG oslo_concurrency.lockutils [None req-14564fc1-463e-4113-99b4-d5fbbe2b2fe1 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.872s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.683 2 INFO nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Snapshot image upload complete#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.684 2 DEBUG nova.compute.manager [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.878 2 INFO nova.compute.manager [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Shelve offloading#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.905 2 INFO nova.virt.libvirt.driver [-] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance destroyed successfully.#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.906 2 DEBUG nova.compute.manager [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.908 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.908 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:08 np0005466013 nova_compute[192144]: 2025-10-02 12:17:08.908 2 DEBUG nova.network.neutron [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:17:09 np0005466013 nova_compute[192144]: 2025-10-02 12:17:09.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.269 2 DEBUG nova.compute.manager [req-007f3ac5-5f10-4787-84bb-afadd9c9db34 req-cd48c78c-0e82-4295-81ba-3653e2737912 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received event network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.269 2 DEBUG oslo_concurrency.lockutils [req-007f3ac5-5f10-4787-84bb-afadd9c9db34 req-cd48c78c-0e82-4295-81ba-3653e2737912 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.270 2 DEBUG oslo_concurrency.lockutils [req-007f3ac5-5f10-4787-84bb-afadd9c9db34 req-cd48c78c-0e82-4295-81ba-3653e2737912 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.270 2 DEBUG oslo_concurrency.lockutils [req-007f3ac5-5f10-4787-84bb-afadd9c9db34 req-cd48c78c-0e82-4295-81ba-3653e2737912 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.270 2 DEBUG nova.compute.manager [req-007f3ac5-5f10-4787-84bb-afadd9c9db34 req-cd48c78c-0e82-4295-81ba-3653e2737912 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] No waiting events found dispatching network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.270 2 WARNING nova.compute.manager [req-007f3ac5-5f10-4787-84bb-afadd9c9db34 req-cd48c78c-0e82-4295-81ba-3653e2737912 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received unexpected event network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.762 2 DEBUG nova.network.neutron [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Updating instance_info_cache with network_info: [{"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:10 np0005466013 nova_compute[192144]: 2025-10-02 12:17:10.813 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:12 np0005466013 nova_compute[192144]: 2025-10-02 12:17:12.829 2 INFO nova.virt.libvirt.driver [-] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Instance destroyed successfully.#033[00m
Oct  2 08:17:12 np0005466013 nova_compute[192144]: 2025-10-02 12:17:12.830 2 DEBUG nova.objects.instance [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'resources' on Instance uuid 03d3f4d9-1589-440a-80c8-3348a75c106b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.071 2 DEBUG nova.virt.libvirt.vif [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1492208900',display_name='tempest-DeleteServersTestJSON-server-1492208900',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1492208900',id=83,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-9alpbnhm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member',shelved_at='2025-10-02T12:17:08.683963',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='8ed27dea-9dce-4a89-bf0d-5cdeb3cbcbad'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:17:03Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=03d3f4d9-1589-440a-80c8-3348a75c106b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.072 2 DEBUG nova.network.os_vif_util [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.074 2 DEBUG nova.network.os_vif_util [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:e6:8d,bridge_name='br-int',has_traffic_filtering=True,id=920a7e7c-2b32-4e38-932f-e8ed19c81f7c,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap920a7e7c-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.074 2 DEBUG os_vif [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:e6:8d,bridge_name='br-int',has_traffic_filtering=True,id=920a7e7c-2b32-4e38-932f-e8ed19c81f7c,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap920a7e7c-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.077 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap920a7e7c-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.087 2 INFO os_vif [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:e6:8d,bridge_name='br-int',has_traffic_filtering=True,id=920a7e7c-2b32-4e38-932f-e8ed19c81f7c,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap920a7e7c-2b')#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.088 2 INFO nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Deleting instance files /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b_del#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.100 2 INFO nova.virt.libvirt.driver [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Deletion of /var/lib/nova/instances/03d3f4d9-1589-440a-80c8-3348a75c106b_del complete#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.244 2 DEBUG nova.compute.manager [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Received event network-changed-920a7e7c-2b32-4e38-932f-e8ed19c81f7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.245 2 DEBUG nova.compute.manager [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Refreshing instance network info cache due to event network-changed-920a7e7c-2b32-4e38-932f-e8ed19c81f7c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.245 2 DEBUG oslo_concurrency.lockutils [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.245 2 DEBUG oslo_concurrency.lockutils [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.246 2 DEBUG nova.network.neutron [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Refreshing network info cache for port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.454 2 INFO nova.scheduler.client.report [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Deleted allocations for instance 03d3f4d9-1589-440a-80c8-3348a75c106b#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.705 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.707 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.794 2 DEBUG nova.compute.provider_tree [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:13 np0005466013 nova_compute[192144]: 2025-10-02 12:17:13.943 2 DEBUG nova.scheduler.client.report [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:14 np0005466013 nova_compute[192144]: 2025-10-02 12:17:14.092 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:14 np0005466013 nova_compute[192144]: 2025-10-02 12:17:14.211 2 DEBUG oslo_concurrency.lockutils [None req-2c6497fe-b078-450a-a51a-19507e17a1a5 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "03d3f4d9-1589-440a-80c8-3348a75c106b" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 25.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:14 np0005466013 podman[232650]: 2025-10-02 12:17:14.696903012 +0000 UTC m=+0.069751384 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:17:14 np0005466013 podman[232651]: 2025-10-02 12:17:14.697230522 +0000 UTC m=+0.067494213 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:17:14 np0005466013 podman[232652]: 2025-10-02 12:17:14.759540345 +0000 UTC m=+0.125947498 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 08:17:15 np0005466013 nova_compute[192144]: 2025-10-02 12:17:15.841 2 DEBUG nova.network.neutron [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Updated VIF entry in instance network info cache for port 920a7e7c-2b32-4e38-932f-e8ed19c81f7c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:15 np0005466013 nova_compute[192144]: 2025-10-02 12:17:15.842 2 DEBUG nova.network.neutron [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Updating instance_info_cache with network_info: [{"id": "920a7e7c-2b32-4e38-932f-e8ed19c81f7c", "address": "fa:16:3e:46:e6:8d", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": null, "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap920a7e7c-2b", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:15 np0005466013 nova_compute[192144]: 2025-10-02 12:17:15.893 2 DEBUG oslo_concurrency.lockutils [req-ff1da10a-6726-4d10-b114-0d612bba70dc req-eb5c2eed-7d55-47b8-bdbc-53921587db12 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-03d3f4d9-1589-440a-80c8-3348a75c106b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.351 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a37b003e210c51c060aa24665a0e142885da2fb6424f9ea2d5a1b0a343c6ab3e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.618 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 02 Oct 2025 12:17:16 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-268cad98-be0c-4c2b-8df4-8e0b04d69af7 x-openstack-request-id: req-268cad98-be0c-4c2b-8df4-8e0b04d69af7 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.618 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "9949d9da-6314-4ede-8797-6f2f0a6a64fc", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}]}, {"id": "9ac83da7-f31e-4467-8569-d28002f6aeed", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.618 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-268cad98-be0c-4c2b-8df4-8e0b04d69af7 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.620 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a37b003e210c51c060aa24665a0e142885da2fb6424f9ea2d5a1b0a343c6ab3e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.714 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Thu, 02 Oct 2025 12:17:16 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-55e691c2-e337-46cb-a0a2-145cae0d7429 x-openstack-request-id: req-55e691c2-e337-46cb-a0a2-145cae0d7429 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.714 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "9949d9da-6314-4ede-8797-6f2f0a6a64fc", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.714 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc used request id req-55e691c2-e337-46cb-a0a2-145cae0d7429 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.716 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f75b8563-463c-4024-97d0-befe60db5872', 'name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000057', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'user_id': '001d2d51902d4e299b775131f430a5db', 'hostId': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.716 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.735 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.736 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b2a7ee9-7ff1-44e1-ae02-50b334851706', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.716619', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc52c7b8-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'e67e6592754b5a7ea8a780c5ba7b4f62e2953b6104ae1800ab98fb25d541dc65'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 
'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.716619', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc52d7b2-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': '7131f98c4e57ec4fa003235c8c3e2a35b4b499e5101820c993d031a75fcf89c8'}]}, 'timestamp': '2025-10-02 12:17:16.736906', '_unique_id': '81442bb025c94da6ba143736f3a6ed51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.738 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.739 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.757 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/cpu volume: 8330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ca4bac9-d00e-49c8-bcad-ba39f64ba9f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8330000000, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'timestamp': '2025-10-02T12:17:16.739777', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bc560fae-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.435435239, 'message_signature': 'de6a6ff84b8961c230dfa7b807a71c52db4b86fdfec062ac934b1391bb569d37'}]}, 'timestamp': '2025-10-02 12:17:16.758039', '_unique_id': '81f60202adce4af69cd3cff783ac9d32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.759 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.760 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.762 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f75b8563-463c-4024-97d0-befe60db5872 / tapd86401f6-e4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.763 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27e3cce8-2f21-48a9-85a4-d7f7b1d9ccb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.760633', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc56e69a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '6c3e718fdc8941152723b90100353f8d0199f79256639fc0d73b2157f422a601'}]}, 'timestamp': '2025-10-02 12:17:16.763490', '_unique_id': 'c74db970b97641629e708ba6a6af824e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.764 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.765 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06f92dce-789a-454e-aa56-8e6b8ad141fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.765410', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc574216-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '33ae894dc09c26b4ec74518b143bba22ae5e0d316fc70bc3718ad472333d0a7d'}]}, 'timestamp': '2025-10-02 12:17:16.765826', '_unique_id': 'f23d9225a1ee41788316ea69616c107c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.766 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.768 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7782a5f3-41d7-4f09-a760-e010a8183974', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.768269', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc57b1a6-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '34c923cc12e01deae4fb91ea07bd3313022e0ba1b0be3b5382aa9dec053557df'}]}, 'timestamp': '2025-10-02 12:17:16.768674', '_unique_id': '652313177aee49e2bcba2fa024a69f71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.769 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.770 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.770 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>]
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.771 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.771 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a0ceccd-0938-4c44-8aa7-72fddbe8d18c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.771322', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc5828a2-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': '6dfee2a55ddc1406badfeea7300f9dccce47b33117056828c7013776189143f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.771322', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc58368a-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'e571afed3e8ca7b538b8f684d24e8c0f7382c133e4eb7df19cb8d1138f004790'}]}, 'timestamp': '2025-10-02 12:17:16.772047', '_unique_id': 'b77e2993be254622a93b0d7c819302ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.772 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.774 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '212c1fd6-7af5-40d8-ae35-d2f785e7a2a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.774049', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc589346-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '185c0ca93fcb53e6ae65ee3014df81ec5b338ed6876e155934bfd623004f3425'}]}, 'timestamp': '2025-10-02 12:17:16.774449', '_unique_id': 'a69465baa6334e8ab67d46e906e5c12a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.775 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.776 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.777 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52cc4752-fe51-46cc-bc2e-62724cbc6162', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.776604', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc58f87c-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'f273290b5e11718a7f06ce6aec44110fea53dc8db8c68dcb80dd44c41c5d186e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 
'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.776604', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc590650-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'fcb3ef669b270ae047ab32f8da6f73d99fd8e97070b6b598f2d8fe10c41f3b53'}]}, 'timestamp': '2025-10-02 12:17:16.777370', '_unique_id': '1ed8f551ba7440d7ac1fe54211335c44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.778 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.789 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.790 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45ecdbce-57f5-4f2f-8a2b-8e6d7e57a265', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.779525', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc5b0086-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.457792559, 'message_signature': '4eb56637d4cdc8e3839078188e4a56e8671acbbac8b6b71ba0945bae690d9b24'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 
'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.779525', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc5b10c6-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.457792559, 'message_signature': '68f773cffc5a0f70a0215f0e77728df2dd06287ea33f873a84fe392945816139'}]}, 'timestamp': '2025-10-02 12:17:16.790758', '_unique_id': '457d6c17edab43a6962d107cb0bc1878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.791 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.793 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.793 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.793 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81c42ee6-3e65-4ae0-a51c-7de1834d1c54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.793144', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc5b7b88-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.457792559, 'message_signature': 'd856b10c9334acbfdbbf12916f789e74735f272ab664a358ff542a9e6a18e133'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.793144', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc5b86b4-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.457792559, 'message_signature': 'e00724f331564e9f0f0335fb492b21e965280003fe14190bd46b1b772d88d761'}]}, 'timestamp': '2025-10-02 12:17:16.793717', '_unique_id': '6196f24bdb3d4169870c8f4fd1d30681'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.794 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.795 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.795 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '125a08c5-a1b8-40f3-bbb3-db3642fd8f87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.795516', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc5bd7b8-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '8057841f208393b88ef20fe07d4e845234cc5a8f4466adaf5291d9f7f7a75e2d'}]}, 'timestamp': '2025-10-02 12:17:16.795804', '_unique_id': '6fac4851f9fd4fdabeff2fc38e89a3d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.796 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.797 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fb4da6e-28f5-4fca-943f-fda28416d4b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.797420', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc5c2344-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '9baece28ffca937e1d600f82e8c144552952a94d07feee446491faaa56c6d2a4'}]}, 'timestamp': '2025-10-02 12:17:16.797774', '_unique_id': '2090ba61fcdf4bfa8954903ff7bd4ae4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.798 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.799 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.799 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cac1eed3-ced6-4630-887d-35b0999a4737', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.799404', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc5c6f20-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': '5910f3e44773f0376ff8cbfea07c7df6f656e0323cc8ae5185a2316f17735d0f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.799404', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc5c78d0-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'de48493bc45b83631a54c5483ce1d73daf5e6384ebcce02380f24bc4e4091e8a'}]}, 'timestamp': '2025-10-02 12:17:16.799962', '_unique_id': '7d9e8111c5984e60b85cf2b435d9c320'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.800 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.801 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.801 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.801 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>]
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.801 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.802 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee8302d3-c691-4f85-b42f-aa6c8114e09f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.802065', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc5ce018-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '0f9f576a78e58fa7c310eb3b88cc593fedae3448a880452e49f58edf2281970f'}]}, 'timestamp': '2025-10-02 12:17:16.802609', '_unique_id': 'c09c67eff51242368b4c572e6b57da65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.803 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.804 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.804 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a1e15b9-1d54-49c2-bb98-fb448ac39fc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.804106', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc5d26a4-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.457792559, 'message_signature': '01f13cb5f62f65731b05d633a67b769546bc54f1e92354679239e7219f0e4256'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.804106', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc5d2fe6-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.457792559, 'message_signature': '703ce749305139aaa8724d927b0a251af26ab13337934df9e82237d9c982738c'}]}, 'timestamp': '2025-10-02 12:17:16.804599', '_unique_id': '5ed4da2a03994062a858efe73601723c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.805 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.806 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.806 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b7973ec-15be-4a0e-a0fe-2918e9fdcbf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.806345', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc5d7e88-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': 'de2ce006d52b99e2ad936ba3a6fb190f55bf57a46e727b2f8379b013acc79fa7'}]}, 'timestamp': '2025-10-02 12:17:16.806630', '_unique_id': '43bc89b8496543ceb78643e67aa51707'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.807 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.808 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.read.latency volume: 656254262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.808 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.read.latency volume: 1038312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dbbd7f5-d2f8-4903-8636-564c3e489539', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 656254262, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.808237', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc5dc938-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': '8c4ebad97cf3583fc5e97038a22661ab82cf7f5241747d481854a41269adadae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1038312, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 
'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.808237', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc5dd4d2-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'ebf06512e6fd33323dd453f83ee22016970fe0a7865faf5e4303e4c121f35e2d'}]}, 'timestamp': '2025-10-02 12:17:16.808873', '_unique_id': '7ab1ab5ece8f449cbb5e7d46cea540a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.809 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.810 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.810 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>]
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.811 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.811 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.811 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1580551683>]
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.811 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.811 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.811 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f75b8563-463c-4024-97d0-befe60db5872: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.811 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.812 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.812 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d46d499-2137-4048-9764-32685a45b46f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-vda', 'timestamp': '2025-10-02T12:17:16.812027', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc5e5be6-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'c413ec288acf3354bed08fa9a6055f1dfadbc5b45975d4f88a9c805b1d3b3c38'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 
'resource_id': 'f75b8563-463c-4024-97d0-befe60db5872-sda', 'timestamp': '2025-10-02T12:17:16.812027', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'instance-00000057', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc5e65b4-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.394843987, 'message_signature': 'cd858747a0e8941fa6e97b2c149251f4e191021a03bda1fbccce03b858b009b3'}]}, 'timestamp': '2025-10-02 12:17:16.812570', '_unique_id': 'cfcf8a3f5f714f819595b179cdc8cd68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.813 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75e0eb93-465f-4abc-ba05-0e33253838ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.814168', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc5eb046-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '841d0db45f3d13a503b71a2bd31980e68690651b75e07b77bc57b058a40c8686'}]}, 'timestamp': '2025-10-02 12:17:16.814452', '_unique_id': '941da56967d1417bb1ff33daebaa5a81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.814 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.815 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.815 12 DEBUG ceilometer.compute.pollsters [-] f75b8563-463c-4024-97d0-befe60db5872/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfd6dc2c-f421-4d33-9652-ce7c91eee2ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000057-f75b8563-463c-4024-97d0-befe60db5872-tapd86401f6-e4', 'timestamp': '2025-10-02T12:17:16.815909', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1580551683', 'name': 'tapd86401f6-e4', 'instance_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'instance_type': 'm1.micro', 'host': '2496da44ba10bf404ca560afaac5f7290e48f65feb6a32884890763b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9949d9da-6314-4ede-8797-6f2f0a6a64fc', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:98:5e:f0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86401f6-e4'}, 'message_id': 'bc5ef3ee-9f89-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5428.438884185, 'message_signature': '527cdc19242d993cdba36019a291a6eaeea98b6558e88c0cb53a0c246e35173b'}]}, 'timestamp': '2025-10-02 12:17:16.816184', '_unique_id': 'f0c8cb680e2c4a2890733d13692cfb26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:17:16.816 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466013 nova_compute[192144]: 2025-10-02 12:17:16.909 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407421.9077523, 03d3f4d9-1589-440a-80c8-3348a75c106b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:16 np0005466013 nova_compute[192144]: 2025-10-02 12:17:16.910 2 INFO nova.compute.manager [-] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:17:18 np0005466013 nova_compute[192144]: 2025-10-02 12:17:18.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:18 np0005466013 nova_compute[192144]: 2025-10-02 12:17:18.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:19 np0005466013 nova_compute[192144]: 2025-10-02 12:17:19.007 2 DEBUG nova.compute.manager [None req-ae08bee0-3a97-4ea7-af80-e2fc964c15c4 - - - - - -] [instance: 03d3f4d9-1589-440a-80c8-3348a75c106b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:20 np0005466013 nova_compute[192144]: 2025-10-02 12:17:20.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:20 np0005466013 nova_compute[192144]: 2025-10-02 12:17:20.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:17:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:21Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:5e:f0 10.100.0.13
Oct  2 08:17:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:21Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:5e:f0 10.100.0.13
Oct  2 08:17:23 np0005466013 nova_compute[192144]: 2025-10-02 12:17:23.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:23 np0005466013 nova_compute[192144]: 2025-10-02 12:17:23.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:23 np0005466013 podman[232729]: 2025-10-02 12:17:23.695782338 +0000 UTC m=+0.063578092 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Oct  2 08:17:23 np0005466013 nova_compute[192144]: 2025-10-02 12:17:23.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:23 np0005466013 nova_compute[192144]: 2025-10-02 12:17:23.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.036 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.037 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.037 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.037 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.130 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.197 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.198 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.264 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.432 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.434 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5467MB free_disk=73.32394027709961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.434 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.434 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.617 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f75b8563-463c-4024-97d0-befe60db5872 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.617 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.618 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.775 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.815 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.854 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:17:24 np0005466013 nova_compute[192144]: 2025-10-02 12:17:24.854 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:25 np0005466013 podman[232758]: 2025-10-02 12:17:25.69725973 +0000 UTC m=+0.063945015 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Oct  2 08:17:25 np0005466013 podman[232757]: 2025-10-02 12:17:25.721944981 +0000 UTC m=+0.091345280 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct  2 08:17:25 np0005466013 nova_compute[192144]: 2025-10-02 12:17:25.854 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:25 np0005466013 nova_compute[192144]: 2025-10-02 12:17:25.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:25 np0005466013 nova_compute[192144]: 2025-10-02 12:17:25.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:26 np0005466013 nova_compute[192144]: 2025-10-02 12:17:26.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:26.487 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:26.488 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:17:26 np0005466013 nova_compute[192144]: 2025-10-02 12:17:26.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:26 np0005466013 nova_compute[192144]: 2025-10-02 12:17:26.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:17:26 np0005466013 nova_compute[192144]: 2025-10-02 12:17:26.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:17:28 np0005466013 nova_compute[192144]: 2025-10-02 12:17:28.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:28 np0005466013 nova_compute[192144]: 2025-10-02 12:17:28.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:29 np0005466013 nova_compute[192144]: 2025-10-02 12:17:29.141 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:29 np0005466013 nova_compute[192144]: 2025-10-02 12:17:29.141 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:29 np0005466013 nova_compute[192144]: 2025-10-02 12:17:29.141 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:17:29 np0005466013 nova_compute[192144]: 2025-10-02 12:17:29.142 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid f75b8563-463c-4024-97d0-befe60db5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:31 np0005466013 podman[232798]: 2025-10-02 12:17:31.675201534 +0000 UTC m=+0.048380134 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:17:31 np0005466013 podman[232799]: 2025-10-02 12:17:31.701558197 +0000 UTC m=+0.071464016 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:17:33 np0005466013 nova_compute[192144]: 2025-10-02 12:17:33.126 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Updating instance_info_cache with network_info: [{"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:33 np0005466013 nova_compute[192144]: 2025-10-02 12:17:33.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:33 np0005466013 nova_compute[192144]: 2025-10-02 12:17:33.482 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-f75b8563-463c-4024-97d0-befe60db5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:33 np0005466013 nova_compute[192144]: 2025-10-02 12:17:33.482 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:17:33 np0005466013 nova_compute[192144]: 2025-10-02 12:17:33.484 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:33 np0005466013 nova_compute[192144]: 2025-10-02 12:17:33.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:35.491 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:36 np0005466013 nova_compute[192144]: 2025-10-02 12:17:36.477 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:38 np0005466013 nova_compute[192144]: 2025-10-02 12:17:38.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:38 np0005466013 nova_compute[192144]: 2025-10-02 12:17:38.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:43 np0005466013 nova_compute[192144]: 2025-10-02 12:17:43.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:43 np0005466013 nova_compute[192144]: 2025-10-02 12:17:43.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:45 np0005466013 podman[232845]: 2025-10-02 12:17:45.67605769 +0000 UTC m=+0.048418225 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible)
Oct  2 08:17:45 np0005466013 podman[232844]: 2025-10-02 12:17:45.680177877 +0000 UTC m=+0.055067640 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:17:45 np0005466013 podman[232846]: 2025-10-02 12:17:45.726496226 +0000 UTC m=+0.091802524 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:17:48 np0005466013 nova_compute[192144]: 2025-10-02 12:17:48.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:48 np0005466013 nova_compute[192144]: 2025-10-02 12:17:48.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:49 np0005466013 nova_compute[192144]: 2025-10-02 12:17:49.945 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:49 np0005466013 nova_compute[192144]: 2025-10-02 12:17:49.946 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:49 np0005466013 nova_compute[192144]: 2025-10-02 12:17:49.946 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:49 np0005466013 nova_compute[192144]: 2025-10-02 12:17:49.946 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:49 np0005466013 nova_compute[192144]: 2025-10-02 12:17:49.946 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:49 np0005466013 nova_compute[192144]: 2025-10-02 12:17:49.962 2 INFO nova.compute.manager [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Terminating instance#033[00m
Oct  2 08:17:49 np0005466013 nova_compute[192144]: 2025-10-02 12:17:49.977 2 DEBUG nova.compute.manager [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:17:49 np0005466013 kernel: tapd86401f6-e4 (unregistering): left promiscuous mode
Oct  2 08:17:50 np0005466013 NetworkManager[51205]: <info>  [1759407470.0034] device (tapd86401f6-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:50Z|00337|binding|INFO|Releasing lport d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 from this chassis (sb_readonly=0)
Oct  2 08:17:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:50Z|00338|binding|INFO|Setting lport d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 down in Southbound
Oct  2 08:17:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:50Z|00339|binding|INFO|Removing iface tapd86401f6-e4 ovn-installed in OVS
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.020 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:5e:f0 10.100.0.13'], port_security=['fa:16:3e:98:5e:f0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4e0bc42-3cfd-4f42-a319-553606576b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a043239b-039e-45fa-8277-43e361a8bae7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.022 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 in datapath bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 unbound from our chassis#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.024 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.025 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[247768f9-f2be-4ac6-9c4a-8038118d78fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.026 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 namespace which is not needed anymore#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000057.scope: Deactivated successfully.
Oct  2 08:17:50 np0005466013 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000057.scope: Consumed 14.924s CPU time.
Oct  2 08:17:50 np0005466013 systemd-machined[152202]: Machine qemu-39-instance-00000057 terminated.
Oct  2 08:17:50 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [NOTICE]   (232639) : haproxy version is 2.8.14-c23fe91
Oct  2 08:17:50 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [NOTICE]   (232639) : path to executable is /usr/sbin/haproxy
Oct  2 08:17:50 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [WARNING]  (232639) : Exiting Master process...
Oct  2 08:17:50 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [WARNING]  (232639) : Exiting Master process...
Oct  2 08:17:50 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [ALERT]    (232639) : Current worker (232641) exited with code 143 (Terminated)
Oct  2 08:17:50 np0005466013 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[232635]: [WARNING]  (232639) : All workers exited. Exiting... (0)
Oct  2 08:17:50 np0005466013 systemd[1]: libpod-e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa.scope: Deactivated successfully.
Oct  2 08:17:50 np0005466013 podman[232936]: 2025-10-02 12:17:50.168719814 +0000 UTC m=+0.050945553 container died e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:17:50 np0005466013 kernel: tapd86401f6-e4: entered promiscuous mode
Oct  2 08:17:50 np0005466013 NetworkManager[51205]: <info>  [1759407470.1995] manager: (tapd86401f6-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/157)
Oct  2 08:17:50 np0005466013 systemd-udevd[232914]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:17:50 np0005466013 kernel: tapd86401f6-e4 (unregistering): left promiscuous mode
Oct  2 08:17:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:50Z|00340|binding|INFO|Claiming lport d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 for this chassis.
Oct  2 08:17:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:50Z|00341|binding|INFO|d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2: Claiming fa:16:3e:98:5e:f0 10.100.0.13
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa-userdata-shm.mount: Deactivated successfully.
Oct  2 08:17:50 np0005466013 systemd[1]: var-lib-containers-storage-overlay-104e279e1377b89d3cda61514788ca782e1f004291b6abbaaf351376e19fa7ad-merged.mount: Deactivated successfully.
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.217 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:5e:f0 10.100.0.13'], port_security=['fa:16:3e:98:5e:f0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4e0bc42-3cfd-4f42-a319-553606576b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a043239b-039e-45fa-8277-43e361a8bae7, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:50 np0005466013 podman[232936]: 2025-10-02 12:17:50.221167382 +0000 UTC m=+0.103393101 container cleanup e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:17:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:17:50Z|00342|binding|INFO|Releasing lport d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 from this chassis (sb_readonly=0)
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.233 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:5e:f0 10.100.0.13'], port_security=['fa:16:3e:98:5e:f0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f75b8563-463c-4024-97d0-befe60db5872', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4e0bc42-3cfd-4f42-a319-553606576b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a043239b-039e-45fa-8277-43e361a8bae7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:50 np0005466013 systemd[1]: libpod-conmon-e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa.scope: Deactivated successfully.
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.260 2 INFO nova.virt.libvirt.driver [-] [instance: f75b8563-463c-4024-97d0-befe60db5872] Instance destroyed successfully.#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.261 2 DEBUG nova.objects.instance [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lazy-loading 'resources' on Instance uuid f75b8563-463c-4024-97d0-befe60db5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.302 2 DEBUG nova.virt.libvirt.vif [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1580551683',display_name='tempest-ListServerFiltersTestJSON-instance-1580551683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1580551683',id=87,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:17:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e0277f0bb0f4a349e2e6d8ddfa24edf',ramdisk_id='',reservation_id='r-7yh3ztad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-298715262',owner_user_name='tempest-ListServerFiltersTestJSON-298715262-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:17:08Z,user_data=None,user_id='001d2d51902d4e299b775131f430a5db',uuid=f75b8563-463c-4024-97d0-befe60db5872,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.302 2 DEBUG nova.network.os_vif_util [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converting VIF {"id": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "address": "fa:16:3e:98:5e:f0", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86401f6-e4", "ovs_interfaceid": "d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.303 2 DEBUG nova.network.os_vif_util [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:5e:f0,bridge_name='br-int',has_traffic_filtering=True,id=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86401f6-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.303 2 DEBUG os_vif [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:5e:f0,bridge_name='br-int',has_traffic_filtering=True,id=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86401f6-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:17:50 np0005466013 podman[232972]: 2025-10-02 12:17:50.304311848 +0000 UTC m=+0.047970862 container remove e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.305 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd86401f6-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.310 2 INFO os_vif [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:5e:f0,bridge_name='br-int',has_traffic_filtering=True,id=d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86401f6-e4')#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.311 2 INFO nova.virt.libvirt.driver [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Deleting instance files /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872_del#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.311 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f57b9afc-2ebd-42da-990d-0ad789548de9]: (4, ('Thu Oct  2 12:17:50 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 (e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa)\ne212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa\nThu Oct  2 12:17:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 (e212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa)\ne212bfa8f2e9804b683b23135b865a1bcf8043734676150c0368ac9c5767fcaa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.312 2 INFO nova.virt.libvirt.driver [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Deletion of /var/lib/nova/instances/f75b8563-463c-4024-97d0-befe60db5872_del complete#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.313 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b00ecfcd-ee5b-485d-a28d-e53c4ea1b0ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.315 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd543a6a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 kernel: tapbd543a6a-b0: left promiscuous mode
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.323 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[220c82dd-400a-4ad4-ba5c-f7f14ec98bcb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.354 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b074a3-d024-4d3d-9026-ec2da79c19e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.356 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7388373a-f982-4692-8a0d-63dce3820658]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.374 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca73dcb-a482-4d51-bdf8-74c901569290]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541756, 'reachable_time': 26558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232989, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 systemd[1]: run-netns-ovnmeta\x2dbd543a6a\x2dbba1\x2d4bd5\x2d9cbf\x2dfc87bf95cbe5.mount: Deactivated successfully.
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.377 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.379 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[44afde6a-6358-4f06-959d-c8ca9a68a4b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.380 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 in datapath bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 unbound from our chassis#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.382 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.382 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9254c1-de93-42a3-bbff-2b3db2565179]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.384 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 in datapath bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 unbound from our chassis#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.385 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:17:50.386 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[45d465cb-60e3-4d84-8d8c-fe45acde5f98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.397 2 INFO nova.compute.manager [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.398 2 DEBUG oslo.service.loopingcall [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.399 2 DEBUG nova.compute.manager [-] [instance: f75b8563-463c-4024-97d0-befe60db5872] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.399 2 DEBUG nova.network.neutron [-] [instance: f75b8563-463c-4024-97d0-befe60db5872] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.402 2 DEBUG nova.compute.manager [req-353fb5c5-68f3-4c81-9761-dcd0c96c95cf req-3eda782e-1075-4bed-b423-a4e5e9696a32 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received event network-vif-unplugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.403 2 DEBUG oslo_concurrency.lockutils [req-353fb5c5-68f3-4c81-9761-dcd0c96c95cf req-3eda782e-1075-4bed-b423-a4e5e9696a32 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.403 2 DEBUG oslo_concurrency.lockutils [req-353fb5c5-68f3-4c81-9761-dcd0c96c95cf req-3eda782e-1075-4bed-b423-a4e5e9696a32 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.403 2 DEBUG oslo_concurrency.lockutils [req-353fb5c5-68f3-4c81-9761-dcd0c96c95cf req-3eda782e-1075-4bed-b423-a4e5e9696a32 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.403 2 DEBUG nova.compute.manager [req-353fb5c5-68f3-4c81-9761-dcd0c96c95cf req-3eda782e-1075-4bed-b423-a4e5e9696a32 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] No waiting events found dispatching network-vif-unplugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:50 np0005466013 nova_compute[192144]: 2025-10-02 12:17:50.404 2 DEBUG nova.compute.manager [req-353fb5c5-68f3-4c81-9761-dcd0c96c95cf req-3eda782e-1075-4bed-b423-a4e5e9696a32 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received event network-vif-unplugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.311 2 DEBUG nova.network.neutron [-] [instance: f75b8563-463c-4024-97d0-befe60db5872] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.336 2 INFO nova.compute.manager [-] [instance: f75b8563-463c-4024-97d0-befe60db5872] Took 0.94 seconds to deallocate network for instance.#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.439 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.439 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.485 2 DEBUG nova.compute.manager [req-4c94c29d-cef1-4773-b276-bf24461efbd2 req-5a1826af-f5eb-4a33-9355-7d744c06c671 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received event network-vif-deleted-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.531 2 DEBUG nova.compute.provider_tree [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.553 2 DEBUG nova.scheduler.client.report [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.584 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.611 2 INFO nova.scheduler.client.report [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Deleted allocations for instance f75b8563-463c-4024-97d0-befe60db5872#033[00m
Oct  2 08:17:51 np0005466013 nova_compute[192144]: 2025-10-02 12:17:51.789 2 DEBUG oslo_concurrency.lockutils [None req-652e1566-422b-41bc-9a4c-ff859a907f96 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:52 np0005466013 nova_compute[192144]: 2025-10-02 12:17:52.550 2 DEBUG nova.compute.manager [req-05f0ed04-13ca-46ca-96d5-58b024cb7f4d req-5ac622ec-25c7-4c22-82e2-a8ce05141e27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received event network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:52 np0005466013 nova_compute[192144]: 2025-10-02 12:17:52.551 2 DEBUG oslo_concurrency.lockutils [req-05f0ed04-13ca-46ca-96d5-58b024cb7f4d req-5ac622ec-25c7-4c22-82e2-a8ce05141e27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f75b8563-463c-4024-97d0-befe60db5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:52 np0005466013 nova_compute[192144]: 2025-10-02 12:17:52.552 2 DEBUG oslo_concurrency.lockutils [req-05f0ed04-13ca-46ca-96d5-58b024cb7f4d req-5ac622ec-25c7-4c22-82e2-a8ce05141e27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:52 np0005466013 nova_compute[192144]: 2025-10-02 12:17:52.552 2 DEBUG oslo_concurrency.lockutils [req-05f0ed04-13ca-46ca-96d5-58b024cb7f4d req-5ac622ec-25c7-4c22-82e2-a8ce05141e27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f75b8563-463c-4024-97d0-befe60db5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:52 np0005466013 nova_compute[192144]: 2025-10-02 12:17:52.552 2 DEBUG nova.compute.manager [req-05f0ed04-13ca-46ca-96d5-58b024cb7f4d req-5ac622ec-25c7-4c22-82e2-a8ce05141e27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] No waiting events found dispatching network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:52 np0005466013 nova_compute[192144]: 2025-10-02 12:17:52.553 2 WARNING nova.compute.manager [req-05f0ed04-13ca-46ca-96d5-58b024cb7f4d req-5ac622ec-25c7-4c22-82e2-a8ce05141e27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f75b8563-463c-4024-97d0-befe60db5872] Received unexpected event network-vif-plugged-d86401f6-e4b1-4e8b-aeb8-8a02c81ea3f2 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:17:53 np0005466013 nova_compute[192144]: 2025-10-02 12:17:53.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:54 np0005466013 podman[232990]: 2025-10-02 12:17:54.69419373 +0000 UTC m=+0.067667039 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:17:55 np0005466013 nova_compute[192144]: 2025-10-02 12:17:55.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:56 np0005466013 podman[233011]: 2025-10-02 12:17:56.685384873 +0000 UTC m=+0.060090645 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9)
Oct  2 08:17:56 np0005466013 podman[233010]: 2025-10-02 12:17:56.713108889 +0000 UTC m=+0.088428710 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:17:58 np0005466013 nova_compute[192144]: 2025-10-02 12:17:58.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:58 np0005466013 nova_compute[192144]: 2025-10-02 12:17:58.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:00 np0005466013 nova_compute[192144]: 2025-10-02 12:18:00.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:02.299 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:02.300 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:02.300 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:02 np0005466013 podman[233049]: 2025-10-02 12:18:02.677803355 +0000 UTC m=+0.058139394 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:18:02 np0005466013 podman[233050]: 2025-10-02 12:18:02.714813538 +0000 UTC m=+0.090747861 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:18:03 np0005466013 nova_compute[192144]: 2025-10-02 12:18:03.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:05 np0005466013 nova_compute[192144]: 2025-10-02 12:18:05.261 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407470.2566628, f75b8563-463c-4024-97d0-befe60db5872 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:05 np0005466013 nova_compute[192144]: 2025-10-02 12:18:05.261 2 INFO nova.compute.manager [-] [instance: f75b8563-463c-4024-97d0-befe60db5872] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:18:05 np0005466013 nova_compute[192144]: 2025-10-02 12:18:05.291 2 DEBUG nova.compute.manager [None req-05a80a63-f12a-4fe5-a908-11e259c4c439 - - - - - -] [instance: f75b8563-463c-4024-97d0-befe60db5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:05 np0005466013 nova_compute[192144]: 2025-10-02 12:18:05.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:06 np0005466013 nova_compute[192144]: 2025-10-02 12:18:06.893 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:06 np0005466013 nova_compute[192144]: 2025-10-02 12:18:06.893 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:06 np0005466013 nova_compute[192144]: 2025-10-02 12:18:06.912 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.021 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.022 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.032 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.033 2 INFO nova.compute.claims [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.176 2 DEBUG nova.compute.provider_tree [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.205 2 DEBUG nova.scheduler.client.report [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.233 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.234 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.297 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.298 2 DEBUG nova.network.neutron [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.334 2 INFO nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.357 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.514 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.516 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.516 2 INFO nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Creating image(s)#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.517 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "/var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.518 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.519 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.540 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.613 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.614 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.615 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.625 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.692 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.693 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.728 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.729 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.730 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.793 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.794 2 DEBUG nova.virt.disk.api [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Checking if we can resize image /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.795 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.822 2 DEBUG nova.policy [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0ba8ddde504431b51e593c63f40361', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5db64e6714348c1a7f57bb53de80915', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.862 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.863 2 DEBUG nova.virt.disk.api [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Cannot resize image /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.863 2 DEBUG nova.objects.instance [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'migration_context' on Instance uuid 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.892 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.892 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Ensure instance console log exists: /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.893 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.893 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:07 np0005466013 nova_compute[192144]: 2025-10-02 12:18:07.894 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:08 np0005466013 nova_compute[192144]: 2025-10-02 12:18:08.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:08 np0005466013 nova_compute[192144]: 2025-10-02 12:18:08.597 2 DEBUG nova.network.neutron [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Successfully created port: ddcd150f-bc18-483e-b0d6-1a300e04de05 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.186 2 DEBUG nova.network.neutron [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Successfully updated port: ddcd150f-bc18-483e-b0d6-1a300e04de05 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.202 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.202 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.202 2 DEBUG nova.network.neutron [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.344 2 DEBUG nova.compute.manager [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received event network-changed-ddcd150f-bc18-483e-b0d6-1a300e04de05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.345 2 DEBUG nova.compute.manager [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Refreshing instance network info cache due to event network-changed-ddcd150f-bc18-483e-b0d6-1a300e04de05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.345 2 DEBUG oslo_concurrency.lockutils [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:10 np0005466013 nova_compute[192144]: 2025-10-02 12:18:10.424 2 DEBUG nova.network.neutron [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.328 2 DEBUG nova.network.neutron [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Updating instance_info_cache with network_info: [{"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.362 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.362 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Instance network_info: |[{"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.363 2 DEBUG oslo_concurrency.lockutils [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.363 2 DEBUG nova.network.neutron [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Refreshing network info cache for port ddcd150f-bc18-483e-b0d6-1a300e04de05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.369 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Start _get_guest_xml network_info=[{"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.376 2 WARNING nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.381 2 DEBUG nova.virt.libvirt.host [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.382 2 DEBUG nova.virt.libvirt.host [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.394 2 DEBUG nova.virt.libvirt.host [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.395 2 DEBUG nova.virt.libvirt.host [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.397 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.397 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.398 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.399 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.400 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.400 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.401 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.402 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.402 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.402 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.402 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.402 2 DEBUG nova.virt.hardware [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.406 2 DEBUG nova.virt.libvirt.vif [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-39022864',display_name='tempest-DeleteServersTestJSON-server-39022864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-39022864',id=89,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-x2soh8h3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:07Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.407 2 DEBUG nova.network.os_vif_util [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.407 2 DEBUG nova.network.os_vif_util [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:88:0e,bridge_name='br-int',has_traffic_filtering=True,id=ddcd150f-bc18-483e-b0d6-1a300e04de05,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd150f-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.408 2 DEBUG nova.objects.instance [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.430 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <uuid>1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd</uuid>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <name>instance-00000059</name>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <nova:name>tempest-DeleteServersTestJSON-server-39022864</nova:name>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:18:12</nova:creationTime>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:user uuid="0c0ba8ddde504431b51e593c63f40361">tempest-DeleteServersTestJSON-548982240-project-member</nova:user>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:project uuid="d5db64e6714348c1a7f57bb53de80915">tempest-DeleteServersTestJSON-548982240</nova:project>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        <nova:port uuid="ddcd150f-bc18-483e-b0d6-1a300e04de05">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <entry name="serial">1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd</entry>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <entry name="uuid">1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd</entry>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk.config"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:60:88:0e"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <target dev="tapddcd150f-bc"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/console.log" append="off"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:18:12 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:18:12 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:18:12 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:18:12 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.432 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Preparing to wait for external event network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.433 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.434 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.434 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.437 2 DEBUG nova.virt.libvirt.vif [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-39022864',display_name='tempest-DeleteServersTestJSON-server-39022864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-39022864',id=89,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-x2soh8h3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON
-548982240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:07Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.437 2 DEBUG nova.network.os_vif_util [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.439 2 DEBUG nova.network.os_vif_util [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:88:0e,bridge_name='br-int',has_traffic_filtering=True,id=ddcd150f-bc18-483e-b0d6-1a300e04de05,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd150f-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.440 2 DEBUG os_vif [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:88:0e,bridge_name='br-int',has_traffic_filtering=True,id=ddcd150f-bc18-483e-b0d6-1a300e04de05,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd150f-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.442 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.443 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.448 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapddcd150f-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.449 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapddcd150f-bc, col_values=(('external_ids', {'iface-id': 'ddcd150f-bc18-483e-b0d6-1a300e04de05', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:88:0e', 'vm-uuid': '1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:12 np0005466013 NetworkManager[51205]: <info>  [1759407492.4528] manager: (tapddcd150f-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.462 2 INFO os_vif [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:88:0e,bridge_name='br-int',has_traffic_filtering=True,id=ddcd150f-bc18-483e-b0d6-1a300e04de05,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd150f-bc')#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.521 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.521 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.521 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No VIF found with MAC fa:16:3e:60:88:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:18:12 np0005466013 nova_compute[192144]: 2025-10-02 12:18:12.522 2 INFO nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Using config drive#033[00m
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.331 2 INFO nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Creating config drive at /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk.config#033[00m
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.337 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppyaj8936 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.467 2 DEBUG oslo_concurrency.processutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppyaj8936" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:13 np0005466013 kernel: tapddcd150f-bc: entered promiscuous mode
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 NetworkManager[51205]: <info>  [1759407493.5354] manager: (tapddcd150f-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Oct  2 08:18:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:13Z|00343|binding|INFO|Claiming lport ddcd150f-bc18-483e-b0d6-1a300e04de05 for this chassis.
Oct  2 08:18:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:13Z|00344|binding|INFO|ddcd150f-bc18-483e-b0d6-1a300e04de05: Claiming fa:16:3e:60:88:0e 10.100.0.10
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.553 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:88:0e 10.100.0.10'], port_security=['fa:16:3e:60:88:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ddcd150f-bc18-483e-b0d6-1a300e04de05) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.556 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ddcd150f-bc18-483e-b0d6-1a300e04de05 in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 bound to our chassis#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.558 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97b8849-844c-4190-8b13-fd7a2d073ce8#033[00m
Oct  2 08:18:13 np0005466013 systemd-udevd[233125]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.572 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[72a1d9c8-3d06-422e-9760-4486cbad910a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.573 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97b8849-81 in ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.576 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97b8849-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.576 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a648c695-f9e0-4867-b04e-41e867b8ef54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.578 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc23288-d8bc-493f-882a-1430740895de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 systemd-machined[152202]: New machine qemu-40-instance-00000059.
Oct  2 08:18:13 np0005466013 NetworkManager[51205]: <info>  [1759407493.5918] device (tapddcd150f-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.590 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[0c2e8aaa-8e81-4930-8fa6-5e707aa12645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 NetworkManager[51205]: <info>  [1759407493.5925] device (tapddcd150f-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 systemd[1]: Started Virtual Machine qemu-40-instance-00000059.
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:13Z|00345|binding|INFO|Setting lport ddcd150f-bc18-483e-b0d6-1a300e04de05 ovn-installed in OVS
Oct  2 08:18:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:13Z|00346|binding|INFO|Setting lport ddcd150f-bc18-483e-b0d6-1a300e04de05 up in Southbound
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.614 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c6fb1ee2-32ce-431e-ad0a-d5952da16347]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.646 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b52d5e4e-e630-44f1-83b2-1a885907080b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 NetworkManager[51205]: <info>  [1759407493.6535] manager: (tapb97b8849-80): new Veth device (/org/freedesktop/NetworkManager/Devices/160)
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.652 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3367f7-0439-4fb1-b161-f63fd4ec129d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 systemd-udevd[233129]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.687 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[6eaabbf8-bd67-47c1-bc6e-1996e0acabe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.690 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a283c13c-1418-4ea6-807f-fc3a0522c19d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 NetworkManager[51205]: <info>  [1759407493.7142] device (tapb97b8849-80): carrier: link connected
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.720 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[206510cf-8f95-44c3-b428-f3ac17658dbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.740 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6b394d-0c93-4e7a-ba0c-f8d21c7707d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548533, 'reachable_time': 18876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233158, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.756 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c98a9d6c-bea0-4d9f-881f-401ef7e825e9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:e0b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548533, 'tstamp': 548533}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233159, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.777 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b959249a-a302-40e0-bdc9-276dac1d5b5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548533, 'reachable_time': 18876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233160, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.812 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2360f6ff-d5ce-4d9c-8a1c-c3a238957473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.887 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6a7758-c80b-457c-a148-a1e0d840c0c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.888 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.889 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.889 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97b8849-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 NetworkManager[51205]: <info>  [1759407493.8920] manager: (tapb97b8849-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Oct  2 08:18:13 np0005466013 kernel: tapb97b8849-80: entered promiscuous mode
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.894 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97b8849-80, col_values=(('external_ids', {'iface-id': '055cf080-4472-4807-a697-69de84e96953'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:13 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:13Z|00347|binding|INFO|Releasing lport 055cf080-4472-4807-a697-69de84e96953 from this chassis (sb_readonly=0)
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 nova_compute[192144]: 2025-10-02 12:18:13.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.914 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.916 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[168f2474-aa6a-4aa4-9f5e-0cd6004551aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.917 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:18:13 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:13.918 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'env', 'PROCESS_TAG=haproxy-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97b8849-844c-4190-8b13-fd7a2d073ce8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:18:14 np0005466013 podman[233199]: 2025-10-02 12:18:14.302306024 +0000 UTC m=+0.059928276 container create 81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:18:14 np0005466013 systemd[1]: Started libpod-conmon-81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d.scope.
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.347 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407494.347466, 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.348 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] VM Started (Lifecycle Event)#033[00m
Oct  2 08:18:14 np0005466013 podman[233199]: 2025-10-02 12:18:14.268717996 +0000 UTC m=+0.026340268 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:18:14 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.369 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.373 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407494.3475688, 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:14 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d65c6cd2b75cfa7bc82f7058f4c2ea6f3d5253f2a864ddabbadd19304db7d2c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.373 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:18:14 np0005466013 podman[233199]: 2025-10-02 12:18:14.386875304 +0000 UTC m=+0.144497576 container init 81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:18:14 np0005466013 podman[233199]: 2025-10-02 12:18:14.393258477 +0000 UTC m=+0.150880729 container start 81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.394 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.398 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:14 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233215]: [NOTICE]   (233219) : New worker (233221) forked
Oct  2 08:18:14 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233215]: [NOTICE]   (233219) : Loading success.
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.421 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.909 2 DEBUG nova.compute.manager [req-ac3fd9f0-a4d3-40b8-b0f0-227a6da9c42d req-6cfc2f26-3177-47ef-99bb-5d6e798d95a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received event network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.910 2 DEBUG oslo_concurrency.lockutils [req-ac3fd9f0-a4d3-40b8-b0f0-227a6da9c42d req-6cfc2f26-3177-47ef-99bb-5d6e798d95a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.911 2 DEBUG oslo_concurrency.lockutils [req-ac3fd9f0-a4d3-40b8-b0f0-227a6da9c42d req-6cfc2f26-3177-47ef-99bb-5d6e798d95a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.911 2 DEBUG oslo_concurrency.lockutils [req-ac3fd9f0-a4d3-40b8-b0f0-227a6da9c42d req-6cfc2f26-3177-47ef-99bb-5d6e798d95a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.911 2 DEBUG nova.compute.manager [req-ac3fd9f0-a4d3-40b8-b0f0-227a6da9c42d req-6cfc2f26-3177-47ef-99bb-5d6e798d95a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Processing event network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.913 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.916 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407494.9160645, 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.916 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.920 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.924 2 INFO nova.virt.libvirt.driver [-] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Instance spawned successfully.#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.925 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.932 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.935 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.944 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.945 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.947 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.948 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.949 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.949 2 DEBUG nova.virt.libvirt.driver [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.953 2 DEBUG nova.network.neutron [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Updated VIF entry in instance network info cache for port ddcd150f-bc18-483e-b0d6-1a300e04de05. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.953 2 DEBUG nova.network.neutron [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Updating instance_info_cache with network_info: [{"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.956 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:14 np0005466013 nova_compute[192144]: 2025-10-02 12:18:14.980 2 DEBUG oslo_concurrency.lockutils [req-5e5441f0-b34f-4857-afc3-37d278a60f27 req-08bb56d9-b1db-42c9-8903-cdb4d71fc42e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:15 np0005466013 nova_compute[192144]: 2025-10-02 12:18:15.176 2 INFO nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Took 7.66 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:18:15 np0005466013 nova_compute[192144]: 2025-10-02 12:18:15.177 2 DEBUG nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:15 np0005466013 nova_compute[192144]: 2025-10-02 12:18:15.262 2 INFO nova.compute.manager [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Took 8.29 seconds to build instance.#033[00m
Oct  2 08:18:15 np0005466013 nova_compute[192144]: 2025-10-02 12:18:15.278 2 DEBUG oslo_concurrency.lockutils [None req-cc5af819-9cdd-4008-bb60-491145885acf 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.480 2 DEBUG nova.objects.instance [None req-704b0b7c-4d49-405f-b265-3ab46363e406 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.504 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407496.5043523, 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.505 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.530 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.536 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.566 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Oct  2 08:18:16 np0005466013 podman[233233]: 2025-10-02 12:18:16.695026577 +0000 UTC m=+0.067713095 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:18:16 np0005466013 podman[233234]: 2025-10-02 12:18:16.698975062 +0000 UTC m=+0.069684207 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:18:16 np0005466013 podman[233235]: 2025-10-02 12:18:16.732101146 +0000 UTC m=+0.099382062 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:18:16 np0005466013 kernel: tapddcd150f-bc (unregistering): left promiscuous mode
Oct  2 08:18:16 np0005466013 NetworkManager[51205]: <info>  [1759407496.9803] device (tapddcd150f-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:16Z|00348|binding|INFO|Releasing lport ddcd150f-bc18-483e-b0d6-1a300e04de05 from this chassis (sb_readonly=0)
Oct  2 08:18:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:16Z|00349|binding|INFO|Setting lport ddcd150f-bc18-483e-b0d6-1a300e04de05 down in Southbound
Oct  2 08:18:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:16Z|00350|binding|INFO|Removing iface tapddcd150f-bc ovn-installed in OVS
Oct  2 08:18:16 np0005466013 nova_compute[192144]: 2025-10-02 12:18:16.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:16.998 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:88:0e 10.100.0.10'], port_security=['fa:16:3e:60:88:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=ddcd150f-bc18-483e-b0d6-1a300e04de05) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.000 103323 INFO neutron.agent.ovn.metadata.agent [-] Port ddcd150f-bc18-483e-b0d6-1a300e04de05 in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.002 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.003 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c16ba4bd-e618-4d2e-a8d4-472a9031cdd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.004 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace which is not needed anymore#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:17 np0005466013 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000059.scope: Deactivated successfully.
Oct  2 08:18:17 np0005466013 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000059.scope: Consumed 2.388s CPU time.
Oct  2 08:18:17 np0005466013 systemd-machined[152202]: Machine qemu-40-instance-00000059 terminated.
Oct  2 08:18:17 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233215]: [NOTICE]   (233219) : haproxy version is 2.8.14-c23fe91
Oct  2 08:18:17 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233215]: [NOTICE]   (233219) : path to executable is /usr/sbin/haproxy
Oct  2 08:18:17 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233215]: [WARNING]  (233219) : Exiting Master process...
Oct  2 08:18:17 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233215]: [ALERT]    (233219) : Current worker (233221) exited with code 143 (Terminated)
Oct  2 08:18:17 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233215]: [WARNING]  (233219) : All workers exited. Exiting... (0)
Oct  2 08:18:17 np0005466013 systemd[1]: libpod-81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d.scope: Deactivated successfully.
Oct  2 08:18:17 np0005466013 podman[233320]: 2025-10-02 12:18:17.167286527 +0000 UTC m=+0.053905086 container died 81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:18:17 np0005466013 kernel: tapddcd150f-bc: entered promiscuous mode
Oct  2 08:18:17 np0005466013 kernel: tapddcd150f-bc (unregistering): left promiscuous mode
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:17 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:18:17 np0005466013 systemd[1]: var-lib-containers-storage-overlay-d65c6cd2b75cfa7bc82f7058f4c2ea6f3d5253f2a864ddabbadd19304db7d2c6-merged.mount: Deactivated successfully.
Oct  2 08:18:17 np0005466013 podman[233320]: 2025-10-02 12:18:17.225662694 +0000 UTC m=+0.112281233 container cleanup 81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:18:17 np0005466013 systemd[1]: libpod-conmon-81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d.scope: Deactivated successfully.
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.234 2 DEBUG nova.compute.manager [None req-704b0b7c-4d49-405f-b265-3ab46363e406 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:17 np0005466013 podman[233364]: 2025-10-02 12:18:17.302311022 +0000 UTC m=+0.049646071 container remove 81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.309 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c7703f-b09c-41d3-bc45-196274aa27da]: (4, ('Thu Oct  2 12:18:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d)\n81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d\nThu Oct  2 12:18:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d)\n81889c22d87589cc59f88ff4f35b9ab7dd2581a5daf3ec9cbcd131c5eeac961d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.312 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2281e32e-e57a-4bb0-a094-3603085359a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.313 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:17 np0005466013 kernel: tapb97b8849-80: left promiscuous mode
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.338 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6fffaebf-afaa-494d-affe-27a2018d33e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.366 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e03b36-ec3b-4e43-8751-9c7bf2af90c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.369 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2208e925-4dc5-4052-a987-9d734ad3dff9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.385 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1931ccca-49af-4c1d-96e6-59022b53428c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548525, 'reachable_time': 43574, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233383, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 systemd[1]: run-netns-ovnmeta\x2db97b8849\x2d844c\x2d4190\x2d8b13\x2dfd7a2d073ce8.mount: Deactivated successfully.
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.390 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:18:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:17.390 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[ffbd8e76-f95a-4186-91ed-20c82f54557a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.643 2 DEBUG nova.compute.manager [req-b04b4f07-f646-4460-9ca0-65937b37eb50 req-c1e9012f-4c02-45de-8a56-96a853410ea3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received event network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.644 2 DEBUG oslo_concurrency.lockutils [req-b04b4f07-f646-4460-9ca0-65937b37eb50 req-c1e9012f-4c02-45de-8a56-96a853410ea3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.644 2 DEBUG oslo_concurrency.lockutils [req-b04b4f07-f646-4460-9ca0-65937b37eb50 req-c1e9012f-4c02-45de-8a56-96a853410ea3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.644 2 DEBUG oslo_concurrency.lockutils [req-b04b4f07-f646-4460-9ca0-65937b37eb50 req-c1e9012f-4c02-45de-8a56-96a853410ea3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.644 2 DEBUG nova.compute.manager [req-b04b4f07-f646-4460-9ca0-65937b37eb50 req-c1e9012f-4c02-45de-8a56-96a853410ea3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] No waiting events found dispatching network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:17 np0005466013 nova_compute[192144]: 2025-10-02 12:18:17.644 2 WARNING nova.compute.manager [req-b04b4f07-f646-4460-9ca0-65937b37eb50 req-c1e9012f-4c02-45de-8a56-96a853410ea3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received unexpected event network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 for instance with vm_state suspended and task_state None.#033[00m
Oct  2 08:18:18 np0005466013 nova_compute[192144]: 2025-10-02 12:18:18.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.240 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.241 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.241 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.241 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.241 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.253 2 INFO nova.compute.manager [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Terminating instance#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.264 2 DEBUG nova.compute.manager [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.274 2 INFO nova.virt.libvirt.driver [-] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Instance destroyed successfully.#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.274 2 DEBUG nova.objects.instance [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'resources' on Instance uuid 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.292 2 DEBUG nova.virt.libvirt.vif [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-39022864',display_name='tempest-DeleteServersTestJSON-server-39022864',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-39022864',id=89,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-x2soh8h3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:18:17Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.293 2 DEBUG nova.network.os_vif_util [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "address": "fa:16:3e:60:88:0e", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd150f-bc", "ovs_interfaceid": "ddcd150f-bc18-483e-b0d6-1a300e04de05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.294 2 DEBUG nova.network.os_vif_util [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:88:0e,bridge_name='br-int',has_traffic_filtering=True,id=ddcd150f-bc18-483e-b0d6-1a300e04de05,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd150f-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.294 2 DEBUG os_vif [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:88:0e,bridge_name='br-int',has_traffic_filtering=True,id=ddcd150f-bc18-483e-b0d6-1a300e04de05,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd150f-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.296 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapddcd150f-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.334 2 INFO os_vif [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:88:0e,bridge_name='br-int',has_traffic_filtering=True,id=ddcd150f-bc18-483e-b0d6-1a300e04de05,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd150f-bc')#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.335 2 INFO nova.virt.libvirt.driver [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Deleting instance files /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd_del#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.336 2 INFO nova.virt.libvirt.driver [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Deletion of /var/lib/nova/instances/1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd_del complete#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.375 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.376 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.428 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.478 2 INFO nova.compute.manager [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Took 0.21 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.479 2 DEBUG oslo.service.loopingcall [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.479 2 DEBUG nova.compute.manager [-] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.479 2 DEBUG nova.network.neutron [-] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.572 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.573 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.580 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.580 2 INFO nova.compute.claims [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.815 2 DEBUG nova.compute.manager [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received event network-vif-unplugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.815 2 DEBUG oslo_concurrency.lockutils [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.816 2 DEBUG oslo_concurrency.lockutils [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.816 2 DEBUG oslo_concurrency.lockutils [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.816 2 DEBUG nova.compute.manager [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] No waiting events found dispatching network-vif-unplugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.817 2 DEBUG nova.compute.manager [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received event network-vif-unplugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.817 2 DEBUG nova.compute.manager [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received event network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.817 2 DEBUG oslo_concurrency.lockutils [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.817 2 DEBUG oslo_concurrency.lockutils [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.817 2 DEBUG oslo_concurrency.lockutils [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.818 2 DEBUG nova.compute.manager [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] No waiting events found dispatching network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.818 2 WARNING nova.compute.manager [req-a1f66155-215c-4c9b-af20-424dc736c1a8 req-77d5ad0c-8b13-4703-a551-fd86a370d6f1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received unexpected event network-vif-plugged-ddcd150f-bc18-483e-b0d6-1a300e04de05 for instance with vm_state suspended and task_state deleting.#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.914 2 DEBUG nova.compute.provider_tree [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.931 2 DEBUG nova.scheduler.client.report [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.984 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:19 np0005466013 nova_compute[192144]: 2025-10-02 12:18:19.984 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:18:20 np0005466013 nova_compute[192144]: 2025-10-02 12:18:20.072 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:18:20 np0005466013 nova_compute[192144]: 2025-10-02 12:18:20.073 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:18:20 np0005466013 nova_compute[192144]: 2025-10-02 12:18:20.406 2 DEBUG nova.policy [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '836c60c20a0f48dd994c9d659781fc06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '49c6a5f4c4c84d7ba686d98befbc981a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:18:20 np0005466013 nova_compute[192144]: 2025-10-02 12:18:20.418 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:18:20 np0005466013 nova_compute[192144]: 2025-10-02 12:18:20.471 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:18:20 np0005466013 nova_compute[192144]: 2025-10-02 12:18:20.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:20 np0005466013 nova_compute[192144]: 2025-10-02 12:18:20.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.201 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.202 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.202 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Creating image(s)#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.203 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "/var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.203 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "/var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.204 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "/var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.217 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.278 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.280 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.281 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.298 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.364 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.365 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.406 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.407 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.407 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.471 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.472 2 DEBUG nova.virt.disk.api [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Checking if we can resize image /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.472 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.538 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.540 2 DEBUG nova.virt.disk.api [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Cannot resize image /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.541 2 DEBUG nova.objects.instance [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lazy-loading 'migration_context' on Instance uuid adf0e304-4d32-438f-9a13-b7171fa09447 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.561 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.561 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Ensure instance console log exists: /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.562 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.562 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.562 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.629 2 DEBUG nova.network.neutron [-] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.647 2 INFO nova.compute.manager [-] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Took 2.17 seconds to deallocate network for instance.#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.752 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.753 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.837 2 DEBUG nova.compute.provider_tree [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.859 2 DEBUG nova.scheduler.client.report [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.880 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.903 2 DEBUG nova.compute.manager [req-3a4ed9a5-30f7-4e6c-9729-35b03d1a7adc req-557db51a-ed5f-4436-a44e-9b9dfb7ab107 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Received event network-vif-deleted-ddcd150f-bc18-483e-b0d6-1a300e04de05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:21 np0005466013 nova_compute[192144]: 2025-10-02 12:18:21.942 2 INFO nova.scheduler.client.report [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Deleted allocations for instance 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd#033[00m
Oct  2 08:18:22 np0005466013 nova_compute[192144]: 2025-10-02 12:18:22.059 2 DEBUG oslo_concurrency.lockutils [None req-9832b163-cda5-498e-92a7-f31c1575adac 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:22 np0005466013 nova_compute[192144]: 2025-10-02 12:18:22.122 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Successfully created port: 0b607104-1ce8-4f80-8ea3-859d222c9b8e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:18:23 np0005466013 nova_compute[192144]: 2025-10-02 12:18:23.441 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Successfully updated port: 0b607104-1ce8-4f80-8ea3-859d222c9b8e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:18:23 np0005466013 nova_compute[192144]: 2025-10-02 12:18:23.486 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "refresh_cache-adf0e304-4d32-438f-9a13-b7171fa09447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:23 np0005466013 nova_compute[192144]: 2025-10-02 12:18:23.486 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquired lock "refresh_cache-adf0e304-4d32-438f-9a13-b7171fa09447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:23 np0005466013 nova_compute[192144]: 2025-10-02 12:18:23.487 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:23 np0005466013 nova_compute[192144]: 2025-10-02 12:18:23.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:23 np0005466013 nova_compute[192144]: 2025-10-02 12:18:23.823 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:18:23 np0005466013 nova_compute[192144]: 2025-10-02 12:18:23.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.022 2 DEBUG nova.compute.manager [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received event network-changed-0b607104-1ce8-4f80-8ea3-859d222c9b8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.023 2 DEBUG nova.compute.manager [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Refreshing instance network info cache due to event network-changed-0b607104-1ce8-4f80-8ea3-859d222c9b8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.023 2 DEBUG oslo_concurrency.lockutils [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-adf0e304-4d32-438f-9a13-b7171fa09447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.027 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.194 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.195 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5617MB free_disk=73.35222244262695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.196 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.196 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.278 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance adf0e304-4d32-438f-9a13-b7171fa09447 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.278 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.279 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.319 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.339 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.362 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.362 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.792 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Updating instance_info_cache with network_info: [{"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.867 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Releasing lock "refresh_cache-adf0e304-4d32-438f-9a13-b7171fa09447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.867 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Instance network_info: |[{"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.868 2 DEBUG oslo_concurrency.lockutils [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-adf0e304-4d32-438f-9a13-b7171fa09447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.868 2 DEBUG nova.network.neutron [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Refreshing network info cache for port 0b607104-1ce8-4f80-8ea3-859d222c9b8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.871 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Start _get_guest_xml network_info=[{"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.875 2 WARNING nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.878 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.879 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.881 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.881 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.882 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.882 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.883 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.883 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.883 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.883 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.884 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.884 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.884 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.884 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.884 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.885 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.887 2 DEBUG nova.virt.libvirt.vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-636246405',display_name='tempest-ListServersNegativeTestJSON-server-636246405-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-636246405-2',id=93,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49c6a5f4c4c84d7ba686d98befbc981a',ramdisk_id='',reservation_id='r-pj0lxy21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1724341867',owner_user_name='tempest-ListServersNegativeTestJSON-1724341867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:20Z,user_data=None,user_id='836c60c20a0f48dd994c9d659781fc06',uuid=adf0e304-4d32-438f-9a13-b7171fa09447,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.888 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converting VIF {"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.888 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:7e:84,bridge_name='br-int',has_traffic_filtering=True,id=0b607104-1ce8-4f80-8ea3-859d222c9b8e,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b607104-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.889 2 DEBUG nova.objects.instance [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lazy-loading 'pci_devices' on Instance uuid adf0e304-4d32-438f-9a13-b7171fa09447 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.922 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <uuid>adf0e304-4d32-438f-9a13-b7171fa09447</uuid>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <name>instance-0000005d</name>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <nova:name>tempest-ListServersNegativeTestJSON-server-636246405-2</nova:name>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:18:24</nova:creationTime>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:user uuid="836c60c20a0f48dd994c9d659781fc06">tempest-ListServersNegativeTestJSON-1724341867-project-member</nova:user>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:project uuid="49c6a5f4c4c84d7ba686d98befbc981a">tempest-ListServersNegativeTestJSON-1724341867</nova:project>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        <nova:port uuid="0b607104-1ce8-4f80-8ea3-859d222c9b8e">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <entry name="serial">adf0e304-4d32-438f-9a13-b7171fa09447</entry>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <entry name="uuid">adf0e304-4d32-438f-9a13-b7171fa09447</entry>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk.config"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:3c:7e:84"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <target dev="tap0b607104-1c"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/console.log" append="off"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:18:24 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:18:24 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:18:24 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:18:24 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.923 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Preparing to wait for external event network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.923 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.924 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.924 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.925 2 DEBUG nova.virt.libvirt.vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-636246405',display_name='tempest-ListServersNegativeTestJSON-server-636246405-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-636246405-2',id=93,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49c6a5f4c4c84d7ba686d98befbc981a',ramdisk_id='',reservation_id='r-pj0lxy21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1724341867',owner_user_nam
e='tempest-ListServersNegativeTestJSON-1724341867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:20Z,user_data=None,user_id='836c60c20a0f48dd994c9d659781fc06',uuid=adf0e304-4d32-438f-9a13-b7171fa09447,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.925 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converting VIF {"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.925 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:7e:84,bridge_name='br-int',has_traffic_filtering=True,id=0b607104-1ce8-4f80-8ea3-859d222c9b8e,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b607104-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.926 2 DEBUG os_vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:7e:84,bridge_name='br-int',has_traffic_filtering=True,id=0b607104-1ce8-4f80-8ea3-859d222c9b8e,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b607104-1c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.927 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.927 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.931 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b607104-1c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.931 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b607104-1c, col_values=(('external_ids', {'iface-id': '0b607104-1ce8-4f80-8ea3-859d222c9b8e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:7e:84', 'vm-uuid': 'adf0e304-4d32-438f-9a13-b7171fa09447'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:24 np0005466013 NetworkManager[51205]: <info>  [1759407504.9338] manager: (tap0b607104-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:24 np0005466013 nova_compute[192144]: 2025-10-02 12:18:24.940 2 INFO os_vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:7e:84,bridge_name='br-int',has_traffic_filtering=True,id=0b607104-1ce8-4f80-8ea3-859d222c9b8e,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b607104-1c')#033[00m
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.019 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.020 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.020 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] No VIF found with MAC fa:16:3e:3c:7e:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.020 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Using config drive#033[00m
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.575 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Creating config drive at /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk.config#033[00m
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.582 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj9mrrkj9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:25 np0005466013 podman[233403]: 2025-10-02 12:18:25.706075408 +0000 UTC m=+0.075578865 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251001)
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.713 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj9mrrkj9" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:25 np0005466013 kernel: tap0b607104-1c: entered promiscuous mode
Oct  2 08:18:25 np0005466013 NetworkManager[51205]: <info>  [1759407505.7974] manager: (tap0b607104-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:25Z|00351|binding|INFO|Claiming lport 0b607104-1ce8-4f80-8ea3-859d222c9b8e for this chassis.
Oct  2 08:18:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:25Z|00352|binding|INFO|0b607104-1ce8-4f80-8ea3-859d222c9b8e: Claiming fa:16:3e:3c:7e:84 10.100.0.5
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.812 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:7e:84 10.100.0.5'], port_security=['fa:16:3e:3c:7e:84 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'adf0e304-4d32-438f-9a13-b7171fa09447', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49c6a5f4c4c84d7ba686d98befbc981a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '55283a5f-31d5-4a4d-bc9f-4b8e3fc9f6b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cc3415d-eee4-499b-a06c-93196fe04768, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0b607104-1ce8-4f80-8ea3-859d222c9b8e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.814 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0b607104-1ce8-4f80-8ea3-859d222c9b8e in datapath 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 bound to our chassis#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.816 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.827 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4d08b2-8585-435f-a74d-66fd6fce1cd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.828 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5b886deb-a1 in ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.830 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5b886deb-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.830 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f21ba3f6-eb4c-4ef4-a330-57dac52887d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.831 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5a180ee5-3e58-41d5-a30d-ce1d8a30346c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 systemd-udevd[233439]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:25 np0005466013 systemd-machined[152202]: New machine qemu-41-instance-0000005d.
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.843 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[f1111604-0c78-4ee8-94cd-9098d5036150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 NetworkManager[51205]: <info>  [1759407505.8500] device (tap0b607104-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:25 np0005466013 NetworkManager[51205]: <info>  [1759407505.8512] device (tap0b607104-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.862 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[73a0e7b2-4a7e-442d-b7e8-5d96de033dec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:25Z|00353|binding|INFO|Setting lport 0b607104-1ce8-4f80-8ea3-859d222c9b8e ovn-installed in OVS
Oct  2 08:18:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:25Z|00354|binding|INFO|Setting lport 0b607104-1ce8-4f80-8ea3-859d222c9b8e up in Southbound
Oct  2 08:18:25 np0005466013 systemd[1]: Started Virtual Machine qemu-41-instance-0000005d.
Oct  2 08:18:25 np0005466013 nova_compute[192144]: 2025-10-02 12:18:25.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.893 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[bd576e0c-925f-42b2-ae60-1c145a232646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.899 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0f8b37-0547-4eae-b41d-9671c7af92bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 NetworkManager[51205]: <info>  [1759407505.9008] manager: (tap5b886deb-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/164)
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.937 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[52a61660-841d-4941-a4ad-f754517b7dbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.941 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f67049c7-27a0-4576-a3cb-5e671418c7ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 NetworkManager[51205]: <info>  [1759407505.9661] device (tap5b886deb-a0): carrier: link connected
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.974 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[67b055ff-aa13-4208-81e0-8c25d671128f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:25.993 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[993a6850-c19a-4c2d-bc90-b43c034466ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b886deb-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:39:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549758, 'reachable_time': 34988, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233472, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.009 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f1ae96-1578-425e-a7b6-428d235acb4f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:39f1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549758, 'tstamp': 549758}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233473, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.025 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a51d4256-26cc-44db-b8c9-71e726942ca0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b886deb-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:39:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549758, 'reachable_time': 34988, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233474, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.059 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bd7323da-6ec3-432b-94c8-51e6e8d77ac2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.129 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8afceb10-c16f-48bf-83da-dce7d929e156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.131 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b886deb-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.131 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.131 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b886deb-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:26 np0005466013 NetworkManager[51205]: <info>  [1759407506.1343] manager: (tap5b886deb-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Oct  2 08:18:26 np0005466013 kernel: tap5b886deb-a0: entered promiscuous mode
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.139 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5b886deb-a0, col_values=(('external_ids', {'iface-id': '444f6470-b3a4-44de-9f71-88b373acc28c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:26Z|00355|binding|INFO|Releasing lport 444f6470-b3a4-44de-9f71-88b373acc28c from this chassis (sb_readonly=0)
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.143 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.144 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[820a73d3-de62-4b4c-b40c-55731184a582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.144 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.pid.haproxy
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:18:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:26.145 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'env', 'PROCESS_TAG=haproxy-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.363 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:26 np0005466013 podman[233513]: 2025-10-02 12:18:26.535622623 +0000 UTC m=+0.056046393 container create f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:18:26 np0005466013 systemd[1]: Started libpod-conmon-f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76.scope.
Oct  2 08:18:26 np0005466013 podman[233513]: 2025-10-02 12:18:26.505001699 +0000 UTC m=+0.025425479 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:18:26 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:18:26 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eee39f93910b8a7c9555b063619774d2bf14979581701cd3c23a05dbfe2b54b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:18:26 np0005466013 podman[233513]: 2025-10-02 12:18:26.626657589 +0000 UTC m=+0.147081379 container init f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:18:26 np0005466013 podman[233513]: 2025-10-02 12:18:26.633509856 +0000 UTC m=+0.153933636 container start f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:18:26 np0005466013 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[233528]: [NOTICE]   (233532) : New worker (233534) forked
Oct  2 08:18:26 np0005466013 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[233528]: [NOTICE]   (233532) : Loading success.
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.735 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407506.7342465, adf0e304-4d32-438f-9a13-b7171fa09447 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.736 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] VM Started (Lifecycle Event)#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.767 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.775 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407506.734472, adf0e304-4d32-438f-9a13-b7171fa09447 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.776 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.806 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.811 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.847 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:26 np0005466013 nova_compute[192144]: 2025-10-02 12:18:26.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:27 np0005466013 podman[233543]: 2025-10-02 12:18:27.706912427 +0000 UTC m=+0.072519728 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:18:27 np0005466013 podman[233544]: 2025-10-02 12:18:27.740030279 +0000 UTC m=+0.106215808 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:18:27 np0005466013 nova_compute[192144]: 2025-10-02 12:18:27.876 2 DEBUG nova.network.neutron [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Updated VIF entry in instance network info cache for port 0b607104-1ce8-4f80-8ea3-859d222c9b8e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:18:27 np0005466013 nova_compute[192144]: 2025-10-02 12:18:27.877 2 DEBUG nova.network.neutron [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Updating instance_info_cache with network_info: [{"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:27 np0005466013 nova_compute[192144]: 2025-10-02 12:18:27.937 2 DEBUG oslo_concurrency.lockutils [req-9d75f459-c16c-46ae-ad35-f01d95de5925 req-2b9d4b39-c239-44a4-b9f5-3b8148b3ee70 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-adf0e304-4d32-438f-9a13-b7171fa09447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:27 np0005466013 nova_compute[192144]: 2025-10-02 12:18:27.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:27 np0005466013 nova_compute[192144]: 2025-10-02 12:18:27.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:18:27 np0005466013 nova_compute[192144]: 2025-10-02 12:18:27.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.027 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.028 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.092 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received event network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.092 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.093 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.093 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.093 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Processing event network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.093 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received event network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.093 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.094 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.094 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.094 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] No waiting events found dispatching network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.094 2 WARNING nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received unexpected event network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e for instance with vm_state building and task_state spawning.
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.095 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.100 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407508.0993128, adf0e304-4d32-438f-9a13-b7171fa09447 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.100 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] VM Resumed (Lifecycle Event)
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.103 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.106 2 INFO nova.virt.libvirt.driver [-] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Instance spawned successfully.
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.107 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.133 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.141 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.146 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.146 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.147 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.148 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.148 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.149 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.191 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.215 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.215 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.252 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.285 2 INFO nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Took 7.08 seconds to spawn the instance on the hypervisor.
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.286 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.391 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.392 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.405 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.406 2 INFO nova.compute.claims [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Claim successful on node compute-2.ctlplane.example.com
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.415 2 INFO nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Took 8.90 seconds to build instance.
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.451 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.558 2 DEBUG nova.compute.provider_tree [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.576 2 DEBUG nova.scheduler.client.report [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.603 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.604 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.667 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.667 2 DEBUG nova.network.neutron [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.693 2 INFO nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.719 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.883 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.886 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.887 2 INFO nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Creating image(s)
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.887 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.888 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.889 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.907 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:28 np0005466013 nova_compute[192144]: 2025-10-02 12:18:28.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.002 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.002 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.003 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.016 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.092 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.094 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.147 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.149 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.150 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.207 2 DEBUG nova.policy [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0ba8ddde504431b51e593c63f40361', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5db64e6714348c1a7f57bb53de80915', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.214 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.215 2 DEBUG nova.virt.disk.api [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Checking if we can resize image /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.215 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.310 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.312 2 DEBUG nova.virt.disk.api [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Cannot resize image /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.312 2 DEBUG nova.objects.instance [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'migration_context' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.345 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.347 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Ensure instance console log exists: /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.348 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.348 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.349 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:29 np0005466013 nova_compute[192144]: 2025-10-02 12:18:29.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:18:30 np0005466013 nova_compute[192144]: 2025-10-02 12:18:30.413 2 DEBUG nova.network.neutron [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Successfully created port: 4b6da309-2e2d-465d-91bd-9e0bae3250eb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.410 2 DEBUG nova.network.neutron [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Successfully updated port: 4b6da309-2e2d-465d-91bd-9e0bae3250eb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.429 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.430 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.430 2 DEBUG nova.network.neutron [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.556 2 DEBUG nova.compute.manager [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-changed-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.556 2 DEBUG nova.compute.manager [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Refreshing instance network info cache due to event network-changed-4b6da309-2e2d-465d-91bd-9e0bae3250eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.557 2 DEBUG oslo_concurrency.lockutils [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:18:31 np0005466013 nova_compute[192144]: 2025-10-02 12:18:31.608 2 DEBUG nova.network.neutron [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.236 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407497.235196, 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.237 2 INFO nova.compute.manager [-] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.272 2 DEBUG nova.compute.manager [None req-92f62155-cdc9-4c21-a088-1b3378340270 - - - - - -] [instance: 1b50d6f6-1a5f-4e5a-979a-6c881a19d9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.668 2 DEBUG nova.network.neutron [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.701 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.702 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance network_info: |[{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.703 2 DEBUG oslo_concurrency.lockutils [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.704 2 DEBUG nova.network.neutron [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Refreshing network info cache for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.706 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Start _get_guest_xml network_info=[{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.710 2 WARNING nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.716 2 DEBUG nova.virt.libvirt.host [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.717 2 DEBUG nova.virt.libvirt.host [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.725 2 DEBUG nova.virt.libvirt.host [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.726 2 DEBUG nova.virt.libvirt.host [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.729 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.729 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.730 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.730 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.730 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.731 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.731 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.731 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.732 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.732 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.732 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.733 2 DEBUG nova.virt.hardware [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.737 2 DEBUG nova.virt.libvirt.vif [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2096752872',display_name='tempest-DeleteServersTestJSON-server-2096752872',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-2096752872',id=95,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-frz7g55l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548
982240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:28Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=95da5a4a-5301-4a2b-b135-01e08486477d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.738 2 DEBUG nova.network.os_vif_util [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.739 2 DEBUG nova.network.os_vif_util [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.740 2 DEBUG nova.objects.instance [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.776 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <uuid>95da5a4a-5301-4a2b-b135-01e08486477d</uuid>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <name>instance-0000005f</name>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <nova:name>tempest-DeleteServersTestJSON-server-2096752872</nova:name>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:18:32</nova:creationTime>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:user uuid="0c0ba8ddde504431b51e593c63f40361">tempest-DeleteServersTestJSON-548982240-project-member</nova:user>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:project uuid="d5db64e6714348c1a7f57bb53de80915">tempest-DeleteServersTestJSON-548982240</nova:project>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        <nova:port uuid="4b6da309-2e2d-465d-91bd-9e0bae3250eb">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <entry name="serial">95da5a4a-5301-4a2b-b135-01e08486477d</entry>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <entry name="uuid">95da5a4a-5301-4a2b-b135-01e08486477d</entry>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:5d:9c:dc"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <target dev="tap4b6da309-2e"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/console.log" append="off"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:18:32 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:18:32 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:18:32 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:18:32 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.783 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Preparing to wait for external event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.784 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.784 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.785 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.785 2 DEBUG nova.virt.libvirt.vif [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2096752872',display_name='tempest-DeleteServersTestJSON-server-2096752872',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-2096752872',id=95,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-frz7g55l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTe
stJSON-548982240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:28Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=95da5a4a-5301-4a2b-b135-01e08486477d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.786 2 DEBUG nova.network.os_vif_util [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.787 2 DEBUG nova.network.os_vif_util [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.787 2 DEBUG os_vif [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.788 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.792 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b6da309-2e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.793 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b6da309-2e, col_values=(('external_ids', {'iface-id': '4b6da309-2e2d-465d-91bd-9e0bae3250eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:9c:dc', 'vm-uuid': '95da5a4a-5301-4a2b-b135-01e08486477d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:32 np0005466013 NetworkManager[51205]: <info>  [1759407512.7964] manager: (tap4b6da309-2e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.804 2 INFO os_vif [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e')#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.868 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.869 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.870 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No VIF found with MAC fa:16:3e:5d:9c:dc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.870 2 INFO nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Using config drive#033[00m
Oct  2 08:18:32 np0005466013 podman[233602]: 2025-10-02 12:18:32.90515122 +0000 UTC m=+0.065577027 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:18:32 np0005466013 podman[233603]: 2025-10-02 12:18:32.908476866 +0000 UTC m=+0.065412042 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct  2 08:18:32 np0005466013 nova_compute[192144]: 2025-10-02 12:18:32.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.264 2 INFO nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Creating config drive at /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.273 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokhxrtbv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.410 2 DEBUG oslo_concurrency.processutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokhxrtbv" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:33 np0005466013 kernel: tap4b6da309-2e: entered promiscuous mode
Oct  2 08:18:33 np0005466013 NetworkManager[51205]: <info>  [1759407513.4943] manager: (tap4b6da309-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Oct  2 08:18:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:33Z|00356|binding|INFO|Claiming lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb for this chassis.
Oct  2 08:18:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:33Z|00357|binding|INFO|4b6da309-2e2d-465d-91bd-9e0bae3250eb: Claiming fa:16:3e:5d:9c:dc 10.100.0.8
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:33Z|00358|binding|INFO|Setting lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb ovn-installed in OVS
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.518 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:9c:dc 10.100.0.8'], port_security=['fa:16:3e:5d:9c:dc 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '95da5a4a-5301-4a2b-b135-01e08486477d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4b6da309-2e2d-465d-91bd-9e0bae3250eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.521 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4b6da309-2e2d-465d-91bd-9e0bae3250eb in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 bound to our chassis#033[00m
Oct  2 08:18:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:33Z|00359|binding|INFO|Setting lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb up in Southbound
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.524 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97b8849-844c-4190-8b13-fd7a2d073ce8#033[00m
Oct  2 08:18:33 np0005466013 systemd-udevd[233662]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.539 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1129d838-5942-4145-a695-224a303ecfba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.541 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97b8849-81 in ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.543 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97b8849-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.543 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4473f82c-31ad-4499-bf15-fb56d1e678ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.548 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[807cc99a-e144-4afc-91eb-c4c7a89190fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466013 systemd-machined[152202]: New machine qemu-42-instance-0000005f.
Oct  2 08:18:33 np0005466013 NetworkManager[51205]: <info>  [1759407513.5582] device (tap4b6da309-2e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:33 np0005466013 NetworkManager[51205]: <info>  [1759407513.5597] device (tap4b6da309-2e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.571 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[5e10b173-dd2d-4e0c-993a-0ec563b8fe6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 systemd[1]: Started Virtual Machine qemu-42-instance-0000005f.
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.602 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6354f18d-4378-41c5-80d0-775e97e0ef58]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.639 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ec5c3c-112c-4490-9d70-91ad8ec5d82b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 NetworkManager[51205]: <info>  [1759407513.6520] manager: (tapb97b8849-80): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.653 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[edb93db2-7482-410f-8623-1b964099650c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.698 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf99d43-313a-4ed2-973f-b1524f26f33c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.704 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1486c1-53cf-49a4-9a75-52ced0b455fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 NetworkManager[51205]: <info>  [1759407513.7449] device (tapb97b8849-80): carrier: link connected
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.748 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f61541-ff06-46b4-829e-2d7c7a33103e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.768 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c704625e-772d-4cfa-bdf1-32afce92254e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550535, 'reachable_time': 16367, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233695, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.789 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[78ccd757-6aa8-4a29-bb80-c77039fc5234]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:e0b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550535, 'tstamp': 550535}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233697, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.814 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d4de5c76-8d79-465e-b837-50ffcf13a896]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550535, 'reachable_time': 16367, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233703, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.855 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6f00cf17-2209-4d0e-b199-f76729d3d896]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.875 2 DEBUG nova.network.neutron [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updated VIF entry in instance network info cache for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.876 2 DEBUG nova.network.neutron [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.902 2 DEBUG nova.compute.manager [req-e91cebe9-cb51-495c-9555-8d5a81bc448f req-a44e1458-9a06-4ccf-9cac-dc1d9864e397 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.903 2 DEBUG oslo_concurrency.lockutils [req-e91cebe9-cb51-495c-9555-8d5a81bc448f req-a44e1458-9a06-4ccf-9cac-dc1d9864e397 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.903 2 DEBUG oslo_concurrency.lockutils [req-e91cebe9-cb51-495c-9555-8d5a81bc448f req-a44e1458-9a06-4ccf-9cac-dc1d9864e397 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.903 2 DEBUG oslo_concurrency.lockutils [req-e91cebe9-cb51-495c-9555-8d5a81bc448f req-a44e1458-9a06-4ccf-9cac-dc1d9864e397 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.903 2 DEBUG nova.compute.manager [req-e91cebe9-cb51-495c-9555-8d5a81bc448f req-a44e1458-9a06-4ccf-9cac-dc1d9864e397 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Processing event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.904 2 DEBUG oslo_concurrency.lockutils [req-f89ab445-9cfb-4ae8-a4a9-3cabc263f23d req-bbc9be5b-b7e6-41fa-80e8-3d18ddd20b0d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.929 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3bbd20-e094-4ca5-86b2-06ededea942f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.931 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.931 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.931 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97b8849-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:33 np0005466013 NetworkManager[51205]: <info>  [1759407513.9344] manager: (tapb97b8849-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Oct  2 08:18:33 np0005466013 kernel: tapb97b8849-80: entered promiscuous mode
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.938 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97b8849-80, col_values=(('external_ids', {'iface-id': '055cf080-4472-4807-a697-69de84e96953'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:33Z|00360|binding|INFO|Releasing lport 055cf080-4472-4807-a697-69de84e96953 from this chassis (sb_readonly=0)
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.942 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466013 nova_compute[192144]: 2025-10-02 12:18:33.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.958 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b2db808f-993a-485b-8f17-3de76c53fdf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.959 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:18:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:33.960 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'env', 'PROCESS_TAG=haproxy-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97b8849-844c-4190-8b13-fd7a2d073ce8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.343 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407514.342791, 95da5a4a-5301-4a2b-b135-01e08486477d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.344 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] VM Started (Lifecycle Event)#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.346 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.353 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.357 2 INFO nova.virt.libvirt.driver [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance spawned successfully.#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.357 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.364 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.375 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.382 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.382 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.383 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.383 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.384 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.384 2 DEBUG nova.virt.libvirt.driver [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:34 np0005466013 podman[233736]: 2025-10-02 12:18:34.403876268 +0000 UTC m=+0.070237624 container create a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.406 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.407 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407514.3429706, 95da5a4a-5301-4a2b-b135-01e08486477d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.407 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.431 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.436 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407514.353136, 95da5a4a-5301-4a2b-b135-01e08486477d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.436 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:18:34 np0005466013 systemd[1]: Started libpod-conmon-a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf.scope.
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.459 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.464 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:34 np0005466013 podman[233736]: 2025-10-02 12:18:34.372515921 +0000 UTC m=+0.038877287 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.470 2 INFO nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Took 5.59 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.471 2 DEBUG nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:34 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.481 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:34 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f70c8961ae43d895c3644e08dbf6eef49d335abe8c36fcce46ab27661637d0e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:18:34 np0005466013 podman[233736]: 2025-10-02 12:18:34.497807296 +0000 UTC m=+0.164168682 container init a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:18:34 np0005466013 podman[233736]: 2025-10-02 12:18:34.505408858 +0000 UTC m=+0.171770214 container start a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:18:34 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233751]: [NOTICE]   (233755) : New worker (233757) forked
Oct  2 08:18:34 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233751]: [NOTICE]   (233755) : Loading success.
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.555 2 INFO nova.compute.manager [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Took 6.20 seconds to build instance.#033[00m
Oct  2 08:18:34 np0005466013 nova_compute[192144]: 2025-10-02 12:18:34.575 2 DEBUG oslo_concurrency.lockutils [None req-f7687797-1f8b-4484-b342-355b92191448 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:35 np0005466013 nova_compute[192144]: 2025-10-02 12:18:35.982 2 DEBUG nova.compute.manager [req-316a8c39-a1e4-4d86-b046-ddd5fefc631a req-fd4950cb-d4d0-4295-99a7-54c14d82da13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:35 np0005466013 nova_compute[192144]: 2025-10-02 12:18:35.984 2 DEBUG oslo_concurrency.lockutils [req-316a8c39-a1e4-4d86-b046-ddd5fefc631a req-fd4950cb-d4d0-4295-99a7-54c14d82da13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:35 np0005466013 nova_compute[192144]: 2025-10-02 12:18:35.987 2 DEBUG oslo_concurrency.lockutils [req-316a8c39-a1e4-4d86-b046-ddd5fefc631a req-fd4950cb-d4d0-4295-99a7-54c14d82da13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:35 np0005466013 nova_compute[192144]: 2025-10-02 12:18:35.988 2 DEBUG oslo_concurrency.lockutils [req-316a8c39-a1e4-4d86-b046-ddd5fefc631a req-fd4950cb-d4d0-4295-99a7-54c14d82da13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:35 np0005466013 nova_compute[192144]: 2025-10-02 12:18:35.988 2 DEBUG nova.compute.manager [req-316a8c39-a1e4-4d86-b046-ddd5fefc631a req-fd4950cb-d4d0-4295-99a7-54c14d82da13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:35 np0005466013 nova_compute[192144]: 2025-10-02 12:18:35.988 2 WARNING nova.compute.manager [req-316a8c39-a1e4-4d86-b046-ddd5fefc631a req-fd4950cb-d4d0-4295-99a7-54c14d82da13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state None.#033[00m
Oct  2 08:18:37 np0005466013 nova_compute[192144]: 2025-10-02 12:18:37.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:37.814 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:37.816 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:18:37 np0005466013 nova_compute[192144]: 2025-10-02 12:18:37.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.043 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.045 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.046 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.047 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.048 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.068 2 INFO nova.compute.manager [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Terminating instance#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.086 2 DEBUG nova.compute.manager [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:18:38 np0005466013 kernel: tap0b607104-1c (unregistering): left promiscuous mode
Oct  2 08:18:38 np0005466013 NetworkManager[51205]: <info>  [1759407518.1234] device (tap0b607104-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:18:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:38Z|00361|binding|INFO|Releasing lport 0b607104-1ce8-4f80-8ea3-859d222c9b8e from this chassis (sb_readonly=0)
Oct  2 08:18:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:38Z|00362|binding|INFO|Setting lport 0b607104-1ce8-4f80-8ea3-859d222c9b8e down in Southbound
Oct  2 08:18:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:38Z|00363|binding|INFO|Removing iface tap0b607104-1c ovn-installed in OVS
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.150 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:7e:84 10.100.0.5'], port_security=['fa:16:3e:3c:7e:84 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'adf0e304-4d32-438f-9a13-b7171fa09447', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49c6a5f4c4c84d7ba686d98befbc981a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '55283a5f-31d5-4a4d-bc9f-4b8e3fc9f6b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cc3415d-eee4-499b-a06c-93196fe04768, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0b607104-1ce8-4f80-8ea3-859d222c9b8e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.151 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0b607104-1ce8-4f80-8ea3-859d222c9b8e in datapath 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 unbound from our chassis#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.153 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.154 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c8196e9c-d66e-49cf-9701-67d2363d34a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.154 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 namespace which is not needed anymore#033[00m
Oct  2 08:18:38 np0005466013 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Oct  2 08:18:38 np0005466013 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005d.scope: Consumed 10.791s CPU time.
Oct  2 08:18:38 np0005466013 systemd-machined[152202]: Machine qemu-41-instance-0000005d terminated.
Oct  2 08:18:38 np0005466013 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[233528]: [NOTICE]   (233532) : haproxy version is 2.8.14-c23fe91
Oct  2 08:18:38 np0005466013 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[233528]: [NOTICE]   (233532) : path to executable is /usr/sbin/haproxy
Oct  2 08:18:38 np0005466013 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[233528]: [WARNING]  (233532) : Exiting Master process...
Oct  2 08:18:38 np0005466013 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[233528]: [ALERT]    (233532) : Current worker (233534) exited with code 143 (Terminated)
Oct  2 08:18:38 np0005466013 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[233528]: [WARNING]  (233532) : All workers exited. Exiting... (0)
Oct  2 08:18:38 np0005466013 systemd[1]: libpod-f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76.scope: Deactivated successfully.
Oct  2 08:18:38 np0005466013 podman[233790]: 2025-10-02 12:18:38.310681097 +0000 UTC m=+0.053911856 container died f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76-userdata-shm.mount: Deactivated successfully.
Oct  2 08:18:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay-eee39f93910b8a7c9555b063619774d2bf14979581701cd3c23a05dbfe2b54b3-merged.mount: Deactivated successfully.
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.360 2 INFO nova.virt.libvirt.driver [-] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Instance destroyed successfully.#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.362 2 DEBUG nova.objects.instance [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lazy-loading 'resources' on Instance uuid adf0e304-4d32-438f-9a13-b7171fa09447 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:38 np0005466013 podman[233790]: 2025-10-02 12:18:38.375388655 +0000 UTC m=+0.118619434 container cleanup f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.379 2 DEBUG nova.virt.libvirt.vif [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-636246405',display_name='tempest-ListServersNegativeTestJSON-server-636246405-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-636246405-2',id=93,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-10-02T12:18:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='49c6a5f4c4c84d7ba686d98befbc981a',ramdisk_id='',reservation_id='r-pj0lxy21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1724341867',owner_user_name='tempest-ListServersNegativeTestJSON-1724341867-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:18:28Z,user_data=None,user_id='836c60c20a0f48dd994c9d659781fc06',uuid=adf0e304-4d32-438f-9a13-b7171fa09447,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.380 2 DEBUG nova.network.os_vif_util [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converting VIF {"id": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "address": "fa:16:3e:3c:7e:84", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b607104-1c", "ovs_interfaceid": "0b607104-1ce8-4f80-8ea3-859d222c9b8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.381 2 DEBUG nova.network.os_vif_util [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:7e:84,bridge_name='br-int',has_traffic_filtering=True,id=0b607104-1ce8-4f80-8ea3-859d222c9b8e,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b607104-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.382 2 DEBUG os_vif [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:7e:84,bridge_name='br-int',has_traffic_filtering=True,id=0b607104-1ce8-4f80-8ea3-859d222c9b8e,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b607104-1c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:18:38 np0005466013 systemd[1]: libpod-conmon-f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76.scope: Deactivated successfully.
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.388 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b607104-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.392 2 DEBUG nova.compute.manager [req-653f1e08-907a-4ae2-9bda-c1741d2f7a3d req-435ca875-93db-4cfc-a8e7-6a7f871eb960 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received event network-vif-unplugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.394 2 DEBUG oslo_concurrency.lockutils [req-653f1e08-907a-4ae2-9bda-c1741d2f7a3d req-435ca875-93db-4cfc-a8e7-6a7f871eb960 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.394 2 DEBUG oslo_concurrency.lockutils [req-653f1e08-907a-4ae2-9bda-c1741d2f7a3d req-435ca875-93db-4cfc-a8e7-6a7f871eb960 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.394 2 DEBUG oslo_concurrency.lockutils [req-653f1e08-907a-4ae2-9bda-c1741d2f7a3d req-435ca875-93db-4cfc-a8e7-6a7f871eb960 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.395 2 DEBUG nova.compute.manager [req-653f1e08-907a-4ae2-9bda-c1741d2f7a3d req-435ca875-93db-4cfc-a8e7-6a7f871eb960 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] No waiting events found dispatching network-vif-unplugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.395 2 DEBUG nova.compute.manager [req-653f1e08-907a-4ae2-9bda-c1741d2f7a3d req-435ca875-93db-4cfc-a8e7-6a7f871eb960 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received event network-vif-unplugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.402 2 INFO os_vif [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:7e:84,bridge_name='br-int',has_traffic_filtering=True,id=0b607104-1ce8-4f80-8ea3-859d222c9b8e,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b607104-1c')#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.403 2 INFO nova.virt.libvirt.driver [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Deleting instance files /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447_del#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.404 2 INFO nova.virt.libvirt.driver [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Deletion of /var/lib/nova/instances/adf0e304-4d32-438f-9a13-b7171fa09447_del complete#033[00m
Oct  2 08:18:38 np0005466013 podman[233831]: 2025-10-02 12:18:38.449690508 +0000 UTC m=+0.046428318 container remove f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.455 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e058a1-686e-4c7f-bf85-2743011eb66f]: (4, ('Thu Oct  2 12:18:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 (f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76)\nf337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76\nThu Oct  2 12:18:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 (f337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76)\nf337f7e850e1e1ce83a7e5dc22ae66e83d2bfa056e990892a01b457e87b35e76\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.457 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7b75219d-6a4a-4dc8-81fd-c0cc3fa3339b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.459 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b886deb-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 kernel: tap5b886deb-a0: left promiscuous mode
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.476 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[28b1f120-8a58-49a2-8f33-010ca5d83a87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.507 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba4b531-bdee-4403-9527-8210baea13ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.509 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1d849104-93a9-4e3e-a950-c72634be365a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.518 2 INFO nova.compute.manager [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.519 2 DEBUG oslo.service.loopingcall [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.519 2 DEBUG nova.compute.manager [-] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.520 2 DEBUG nova.network.neutron [-] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.529 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5c856c-26b8-4014-8823-12e4da531dba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549750, 'reachable_time': 23535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233847, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.532 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:18:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:38.532 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[bda22689-3f05-4f02-b22c-7025b9f194fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:38 np0005466013 systemd[1]: run-netns-ovnmeta\x2d5b886deb\x2dac8b\x2d4d5e\x2da6d4\x2db19699c6ae92.mount: Deactivated successfully.
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.933 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.935 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:38 np0005466013 nova_compute[192144]: 2025-10-02 12:18:38.935 2 DEBUG nova.network.neutron [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:39 np0005466013 nova_compute[192144]: 2025-10-02 12:18:39.870 2 DEBUG nova.network.neutron [-] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:39 np0005466013 nova_compute[192144]: 2025-10-02 12:18:39.924 2 INFO nova.compute.manager [-] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Took 1.40 seconds to deallocate network for instance.#033[00m
Oct  2 08:18:39 np0005466013 nova_compute[192144]: 2025-10-02 12:18:39.968 2 DEBUG nova.compute.manager [req-9643abcf-f3cf-4ba9-9605-ce71cce14c47 req-038a7774-0490-402b-a91e-8e8ee09120ec 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received event network-vif-deleted-0b607104-1ce8-4f80-8ea3-859d222c9b8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.034 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.035 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.146 2 DEBUG nova.compute.provider_tree [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.169 2 DEBUG nova.scheduler.client.report [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.191 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.225 2 INFO nova.scheduler.client.report [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Deleted allocations for instance adf0e304-4d32-438f-9a13-b7171fa09447#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.308 2 DEBUG oslo_concurrency.lockutils [None req-14a9e0c9-4670-4755-9b1c-a8590c6eccdd 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.473 2 DEBUG nova.compute.manager [req-262f72fb-4a05-424f-a54c-182d4586d435 req-f429ed85-a75e-4b3a-829c-db87398c00bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received event network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.475 2 DEBUG oslo_concurrency.lockutils [req-262f72fb-4a05-424f-a54c-182d4586d435 req-f429ed85-a75e-4b3a-829c-db87398c00bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.476 2 DEBUG oslo_concurrency.lockutils [req-262f72fb-4a05-424f-a54c-182d4586d435 req-f429ed85-a75e-4b3a-829c-db87398c00bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.477 2 DEBUG oslo_concurrency.lockutils [req-262f72fb-4a05-424f-a54c-182d4586d435 req-f429ed85-a75e-4b3a-829c-db87398c00bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "adf0e304-4d32-438f-9a13-b7171fa09447-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.477 2 DEBUG nova.compute.manager [req-262f72fb-4a05-424f-a54c-182d4586d435 req-f429ed85-a75e-4b3a-829c-db87398c00bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] No waiting events found dispatching network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.478 2 WARNING nova.compute.manager [req-262f72fb-4a05-424f-a54c-182d4586d435 req-f429ed85-a75e-4b3a-829c-db87398c00bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Received unexpected event network-vif-plugged-0b607104-1ce8-4f80-8ea3-859d222c9b8e for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.851 2 DEBUG nova.network.neutron [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.881 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.994 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.996 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Creating file /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/ee6d6f8799144591b47efc6355a523de.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:18:40 np0005466013 nova_compute[192144]: 2025-10-02 12:18:40.996 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/ee6d6f8799144591b47efc6355a523de.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:41 np0005466013 nova_compute[192144]: 2025-10-02 12:18:41.432 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/ee6d6f8799144591b47efc6355a523de.tmp" returned: 1 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:41 np0005466013 nova_compute[192144]: 2025-10-02 12:18:41.434 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/ee6d6f8799144591b47efc6355a523de.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:18:41 np0005466013 nova_compute[192144]: 2025-10-02 12:18:41.435 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Creating directory /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:18:41 np0005466013 nova_compute[192144]: 2025-10-02 12:18:41.436 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:41 np0005466013 nova_compute[192144]: 2025-10-02 12:18:41.655 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:41 np0005466013 nova_compute[192144]: 2025-10-02 12:18:41.661 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:18:43 np0005466013 nova_compute[192144]: 2025-10-02 12:18:43.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:43 np0005466013 nova_compute[192144]: 2025-10-02 12:18:43.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:18:43.819 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:44Z|00364|binding|INFO|Releasing lport 055cf080-4472-4807-a697-69de84e96953 from this chassis (sb_readonly=0)
Oct  2 08:18:44 np0005466013 nova_compute[192144]: 2025-10-02 12:18:44.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466013 podman[233867]: 2025-10-02 12:18:47.701462378 +0000 UTC m=+0.068692475 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:18:47 np0005466013 podman[233866]: 2025-10-02 12:18:47.702297744 +0000 UTC m=+0.072307011 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:18:47 np0005466013 podman[233868]: 2025-10-02 12:18:47.749574728 +0000 UTC m=+0.112347255 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  2 08:18:48 np0005466013 nova_compute[192144]: 2025-10-02 12:18:48.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:48 np0005466013 nova_compute[192144]: 2025-10-02 12:18:48.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:51 np0005466013 nova_compute[192144]: 2025-10-02 12:18:51.722 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:18:53 np0005466013 nova_compute[192144]: 2025-10-02 12:18:53.359 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407518.3572857, adf0e304-4d32-438f-9a13-b7171fa09447 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:53 np0005466013 nova_compute[192144]: 2025-10-02 12:18:53.360 2 INFO nova.compute.manager [-] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:18:53 np0005466013 nova_compute[192144]: 2025-10-02 12:18:53.382 2 DEBUG nova.compute.manager [None req-de992a17-7eba-4e1f-b176-b74ff262f176 - - - - - -] [instance: adf0e304-4d32-438f-9a13-b7171fa09447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:53 np0005466013 nova_compute[192144]: 2025-10-02 12:18:53.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:53 np0005466013 nova_compute[192144]: 2025-10-02 12:18:53.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:54Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5d:9c:dc 10.100.0.8
Oct  2 08:18:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:18:54Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5d:9c:dc 10.100.0.8
Oct  2 08:18:56 np0005466013 podman[233938]: 2025-10-02 12:18:56.71812964 +0000 UTC m=+0.090205269 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.vendor=CentOS)
Oct  2 08:18:58 np0005466013 nova_compute[192144]: 2025-10-02 12:18:58.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:58 np0005466013 nova_compute[192144]: 2025-10-02 12:18:58.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:58 np0005466013 podman[233960]: 2025-10-02 12:18:58.696789263 +0000 UTC m=+0.065380190 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 08:18:58 np0005466013 podman[233959]: 2025-10-02 12:18:58.724343629 +0000 UTC m=+0.094707103 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  2 08:19:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:02.301 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:02.303 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:02.304 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:02 np0005466013 nova_compute[192144]: 2025-10-02 12:19:02.779 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:19:03 np0005466013 nova_compute[192144]: 2025-10-02 12:19:03.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:03 np0005466013 nova_compute[192144]: 2025-10-02 12:19:03.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:03 np0005466013 podman[233998]: 2025-10-02 12:19:03.720174557 +0000 UTC m=+0.079293843 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:19:03 np0005466013 podman[233999]: 2025-10-02 12:19:03.728135071 +0000 UTC m=+0.078863700 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:19:05 np0005466013 kernel: tap4b6da309-2e (unregistering): left promiscuous mode
Oct  2 08:19:05 np0005466013 NetworkManager[51205]: <info>  [1759407545.1136] device (tap4b6da309-2e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:05Z|00365|binding|INFO|Releasing lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb from this chassis (sb_readonly=0)
Oct  2 08:19:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:05Z|00366|binding|INFO|Setting lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb down in Southbound
Oct  2 08:19:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:05Z|00367|binding|INFO|Removing iface tap4b6da309-2e ovn-installed in OVS
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.138 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:9c:dc 10.100.0.8'], port_security=['fa:16:3e:5d:9c:dc 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '95da5a4a-5301-4a2b-b135-01e08486477d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4b6da309-2e2d-465d-91bd-9e0bae3250eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.139 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4b6da309-2e2d-465d-91bd-9e0bae3250eb in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.140 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.142 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3e8692-709f-4169-ab22-ad01603b6c60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.142 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace which is not needed anymore#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:05 np0005466013 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Oct  2 08:19:05 np0005466013 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005f.scope: Consumed 14.081s CPU time.
Oct  2 08:19:05 np0005466013 systemd-machined[152202]: Machine qemu-42-instance-0000005f terminated.
Oct  2 08:19:05 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233751]: [NOTICE]   (233755) : haproxy version is 2.8.14-c23fe91
Oct  2 08:19:05 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233751]: [NOTICE]   (233755) : path to executable is /usr/sbin/haproxy
Oct  2 08:19:05 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233751]: [WARNING]  (233755) : Exiting Master process...
Oct  2 08:19:05 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233751]: [ALERT]    (233755) : Current worker (233757) exited with code 143 (Terminated)
Oct  2 08:19:05 np0005466013 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233751]: [WARNING]  (233755) : All workers exited. Exiting... (0)
Oct  2 08:19:05 np0005466013 systemd[1]: libpod-a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf.scope: Deactivated successfully.
Oct  2 08:19:05 np0005466013 podman[234065]: 2025-10-02 12:19:05.31085789 +0000 UTC m=+0.068222571 container died a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.454 2 DEBUG nova.compute.manager [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.455 2 DEBUG oslo_concurrency.lockutils [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.455 2 DEBUG oslo_concurrency.lockutils [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.456 2 DEBUG oslo_concurrency.lockutils [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.456 2 DEBUG nova.compute.manager [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.457 2 WARNING nova.compute.manager [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:19:05 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf-userdata-shm.mount: Deactivated successfully.
Oct  2 08:19:05 np0005466013 systemd[1]: var-lib-containers-storage-overlay-f70c8961ae43d895c3644e08dbf6eef49d335abe8c36fcce46ab27661637d0e9-merged.mount: Deactivated successfully.
Oct  2 08:19:05 np0005466013 podman[234065]: 2025-10-02 12:19:05.542679352 +0000 UTC m=+0.300044013 container cleanup a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:19:05 np0005466013 systemd[1]: libpod-conmon-a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf.scope: Deactivated successfully.
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.793 2 INFO nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance shutdown successfully after 24 seconds.#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.799 2 INFO nova.virt.libvirt.driver [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance destroyed successfully.#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.800 2 DEBUG nova.virt.libvirt.vif [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2096752872',display_name='tempest-DeleteServersTestJSON-server-2096752872',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-2096752872',id=95,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-frz7g55l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:18:38Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=95da5a4a-5301-4a2b-b135-01e08486477d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1299594383-network", "vif_mac": "fa:16:3e:5d:9c:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.800 2 DEBUG nova.network.os_vif_util [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1299594383-network", "vif_mac": "fa:16:3e:5d:9c:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.801 2 DEBUG nova.network.os_vif_util [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.801 2 DEBUG os_vif [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.803 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b6da309-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.808 2 INFO os_vif [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e')#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.812 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.877 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.879 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:05 np0005466013 podman[234110]: 2025-10-02 12:19:05.935272849 +0000 UTC m=+0.366620831 container remove a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.938 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.940 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Copying file /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk to 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.940 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.940 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b92ed633-9a42-49a1-ac59-2435b80c141b]: (4, ('Thu Oct  2 12:19:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf)\na90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf\nThu Oct  2 12:19:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (a90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf)\na90334938134872264938857ccd2de290bfdcae7c0044dde75bc3445a175b0bf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.941 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bd3a56ec-2e76-49fd-ae1e-e0444e7e3c03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:05.942 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:05 np0005466013 kernel: tapb97b8849-80: left promiscuous mode
Oct  2 08:19:05 np0005466013 nova_compute[192144]: 2025-10-02 12:19:05.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:06 np0005466013 nova_compute[192144]: 2025-10-02 12:19:06.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:06.004 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[88fdfb3e-a552-4a1b-a296-4594b09781d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:06.044 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[86e4d9a1-0299-4dd1-99e4-2ac6ca559b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:06.046 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[76d64943-ca7b-4dee-82a6-2e8c1d1a92d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:06.060 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[453d10f5-30af-47d6-8a60-cf001dc836b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550524, 'reachable_time': 27462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234138, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:06.064 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:19:06 np0005466013 systemd[1]: run-netns-ovnmeta\x2db97b8849\x2d844c\x2d4190\x2d8b13\x2dfd7a2d073ce8.mount: Deactivated successfully.
Oct  2 08:19:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:06.064 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[8838653b-f7e2-4479-9e50-ab76d6cc459e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:06 np0005466013 nova_compute[192144]: 2025-10-02 12:19:06.741 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "scp -r /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk" returned: 0 in 0.801s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:06 np0005466013 nova_compute[192144]: 2025-10-02 12:19:06.742 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Copying file /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:19:06 np0005466013 nova_compute[192144]: 2025-10-02 12:19:06.742 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk.config 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:06 np0005466013 nova_compute[192144]: 2025-10-02 12:19:06.970 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "scp -C -r /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk.config 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:06 np0005466013 nova_compute[192144]: 2025-10-02 12:19:06.971 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Copying file /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:19:06 np0005466013 nova_compute[192144]: 2025-10-02 12:19:06.971 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk.info 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.180 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "scp -C -r /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_resize/disk.info 192.168.122.101:/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.428 2 DEBUG neutronclient.v2_0.client [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.557 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.558 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.558 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.667 2 DEBUG nova.compute.manager [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.667 2 DEBUG oslo_concurrency.lockutils [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.667 2 DEBUG oslo_concurrency.lockutils [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.667 2 DEBUG oslo_concurrency.lockutils [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.668 2 DEBUG nova.compute.manager [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:07 np0005466013 nova_compute[192144]: 2025-10-02 12:19:07.668 2 WARNING nova.compute.manager [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:19:08 np0005466013 nova_compute[192144]: 2025-10-02 12:19:08.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:09 np0005466013 nova_compute[192144]: 2025-10-02 12:19:09.094 2 DEBUG nova.compute.manager [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-changed-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:09 np0005466013 nova_compute[192144]: 2025-10-02 12:19:09.094 2 DEBUG nova.compute.manager [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Refreshing instance network info cache due to event network-changed-4b6da309-2e2d-465d-91bd-9e0bae3250eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:19:09 np0005466013 nova_compute[192144]: 2025-10-02 12:19:09.095 2 DEBUG oslo_concurrency.lockutils [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:09 np0005466013 nova_compute[192144]: 2025-10-02 12:19:09.095 2 DEBUG oslo_concurrency.lockutils [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:09 np0005466013 nova_compute[192144]: 2025-10-02 12:19:09.095 2 DEBUG nova.network.neutron [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Refreshing network info cache for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:19:10 np0005466013 nova_compute[192144]: 2025-10-02 12:19:10.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.704 2 DEBUG nova.compute.manager [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.704 2 DEBUG oslo_concurrency.lockutils [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.705 2 DEBUG oslo_concurrency.lockutils [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.705 2 DEBUG oslo_concurrency.lockutils [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.705 2 DEBUG nova.compute.manager [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.705 2 WARNING nova.compute.manager [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state resize_finish.#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.846 2 DEBUG nova.network.neutron [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updated VIF entry in instance network info cache for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.846 2 DEBUG nova.network.neutron [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:11 np0005466013 nova_compute[192144]: 2025-10-02 12:19:11.990 2 DEBUG oslo_concurrency.lockutils [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.682 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.682 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.683 2 DEBUG nova.compute.manager [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Going to confirm migration 13 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.710 2 DEBUG nova.objects.instance [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'info_cache' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.826 2 DEBUG nova.compute.manager [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.827 2 DEBUG oslo_concurrency.lockutils [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.832 2 DEBUG oslo_concurrency.lockutils [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.832 2 DEBUG oslo_concurrency.lockutils [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.833 2 DEBUG nova.compute.manager [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:13 np0005466013 nova_compute[192144]: 2025-10-02 12:19:13.833 2 WARNING nova.compute.manager [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state resized and task_state deleting.#033[00m
Oct  2 08:19:14 np0005466013 nova_compute[192144]: 2025-10-02 12:19:14.183 2 DEBUG neutronclient.v2_0.client [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:19:14 np0005466013 nova_compute[192144]: 2025-10-02 12:19:14.184 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:14 np0005466013 nova_compute[192144]: 2025-10-02 12:19:14.184 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:14 np0005466013 nova_compute[192144]: 2025-10-02 12:19:14.184 2 DEBUG nova.network.neutron [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:19:14 np0005466013 nova_compute[192144]: 2025-10-02 12:19:14.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:14 np0005466013 nova_compute[192144]: 2025-10-02 12:19:14.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.728 2 DEBUG nova.network.neutron [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.757 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.757 2 DEBUG nova.objects.instance [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'migration_context' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.782 2 DEBUG nova.virt.libvirt.vif [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2096752872',display_name='tempest-DeleteServersTestJSON-server-2096752872',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-2096752872',id=95,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-frz7g55l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:19:13Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=95da5a4a-5301-4a2b-b135-01e08486477d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.783 2 DEBUG nova.network.os_vif_util [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.784 2 DEBUG nova.network.os_vif_util [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.784 2 DEBUG os_vif [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.787 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b6da309-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.787 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.790 2 INFO os_vif [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e')#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.791 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.791 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.894 2 DEBUG nova.compute.provider_tree [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.921 2 DEBUG nova.scheduler.client.report [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:15 np0005466013 nova_compute[192144]: 2025-10-02 12:19:15.961 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:16 np0005466013 nova_compute[192144]: 2025-10-02 12:19:16.109 2 INFO nova.scheduler.client.report [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Deleted allocation for migration 9ac81be8-ebe7-4fa1-8a2c-22673a5e20bf#033[00m
Oct  2 08:19:16 np0005466013 nova_compute[192144]: 2025-10-02 12:19:16.207 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:19:16.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:18 np0005466013 nova_compute[192144]: 2025-10-02 12:19:18.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:18 np0005466013 podman[234143]: 2025-10-02 12:19:18.692778946 +0000 UTC m=+0.056732689 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:19:18 np0005466013 podman[234144]: 2025-10-02 12:19:18.694607063 +0000 UTC m=+0.058128812 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:19:18 np0005466013 podman[234145]: 2025-10-02 12:19:18.73465475 +0000 UTC m=+0.095214807 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:19:20 np0005466013 nova_compute[192144]: 2025-10-02 12:19:20.405 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407545.403772, 95da5a4a-5301-4a2b-b135-01e08486477d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:20 np0005466013 nova_compute[192144]: 2025-10-02 12:19:20.405 2 INFO nova.compute.manager [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:19:20 np0005466013 nova_compute[192144]: 2025-10-02 12:19:20.449 2 DEBUG nova.compute.manager [None req-78f22dc9-6c9b-44ac-af03-321d711e75cc - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:20 np0005466013 nova_compute[192144]: 2025-10-02 12:19:20.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:23 np0005466013 nova_compute[192144]: 2025-10-02 12:19:23.016 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:23 np0005466013 nova_compute[192144]: 2025-10-02 12:19:23.016 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:19:23 np0005466013 nova_compute[192144]: 2025-10-02 12:19:23.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:24 np0005466013 nova_compute[192144]: 2025-10-02 12:19:24.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.016 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.017 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.017 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.017 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.202 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.203 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5689MB free_disk=73.35243606567383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.204 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.204 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.317 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.318 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.346 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.362 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.388 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.388 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466013 nova_compute[192144]: 2025-10-02 12:19:25.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:26 np0005466013 nova_compute[192144]: 2025-10-02 12:19:26.389 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:27 np0005466013 podman[234210]: 2025-10-02 12:19:27.687456804 +0000 UTC m=+0.060733853 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  2 08:19:27 np0005466013 nova_compute[192144]: 2025-10-02 12:19:27.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:27 np0005466013 nova_compute[192144]: 2025-10-02 12:19:27.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:28 np0005466013 nova_compute[192144]: 2025-10-02 12:19:28.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:28 np0005466013 nova_compute[192144]: 2025-10-02 12:19:28.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:29 np0005466013 podman[234230]: 2025-10-02 12:19:29.672005705 +0000 UTC m=+0.051938289 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:19:29 np0005466013 podman[234231]: 2025-10-02 12:19:29.672964015 +0000 UTC m=+0.051499356 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:19:29 np0005466013 nova_compute[192144]: 2025-10-02 12:19:29.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:29 np0005466013 nova_compute[192144]: 2025-10-02 12:19:29.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:19:29 np0005466013 nova_compute[192144]: 2025-10-02 12:19:29.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:19:30 np0005466013 nova_compute[192144]: 2025-10-02 12:19:30.063 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:19:30 np0005466013 nova_compute[192144]: 2025-10-02 12:19:30.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:30 np0005466013 nova_compute[192144]: 2025-10-02 12:19:30.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:31 np0005466013 nova_compute[192144]: 2025-10-02 12:19:31.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:33 np0005466013 nova_compute[192144]: 2025-10-02 12:19:33.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:33 np0005466013 nova_compute[192144]: 2025-10-02 12:19:33.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:33 np0005466013 nova_compute[192144]: 2025-10-02 12:19:33.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:19:34 np0005466013 nova_compute[192144]: 2025-10-02 12:19:34.011 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:19:34 np0005466013 podman[234271]: 2025-10-02 12:19:34.686347901 +0000 UTC m=+0.062072684 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:19:34 np0005466013 podman[234272]: 2025-10-02 12:19:34.71070738 +0000 UTC m=+0.073369517 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:19:35 np0005466013 nova_compute[192144]: 2025-10-02 12:19:35.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:38 np0005466013 nova_compute[192144]: 2025-10-02 12:19:38.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:38 np0005466013 nova_compute[192144]: 2025-10-02 12:19:38.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:39 np0005466013 nova_compute[192144]: 2025-10-02 12:19:39.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:39.159 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:39.161 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:19:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:40.163 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:40 np0005466013 nova_compute[192144]: 2025-10-02 12:19:40.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.437 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.438 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.471 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.578 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.579 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.586 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.587 2 INFO nova.compute.claims [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.771 2 DEBUG nova.compute.provider_tree [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.788 2 DEBUG nova.scheduler.client.report [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.949 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:42 np0005466013 nova_compute[192144]: 2025-10-02 12:19:42.950 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.090 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.091 2 DEBUG nova.network.neutron [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.117 2 INFO nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.135 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.245 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.247 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.247 2 INFO nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Creating image(s)#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.248 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "/var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.248 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "/var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.249 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "/var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.268 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.343 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.344 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.345 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.357 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.421 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.422 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.485 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.486 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.486 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.549 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.550 2 DEBUG nova.virt.disk.api [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Checking if we can resize image /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.550 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.613 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.614 2 DEBUG nova.virt.disk.api [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Cannot resize image /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.614 2 DEBUG nova.objects.instance [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lazy-loading 'migration_context' on Instance uuid b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.625 2 DEBUG nova.policy [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f34ba2e02d4445b8e188b6b8bf09e6b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b80e82beaaf44c9c92c703674174093c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.628 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.628 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Ensure instance console log exists: /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.629 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.629 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.629 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:43 np0005466013 nova_compute[192144]: 2025-10-02 12:19:43.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:44 np0005466013 nova_compute[192144]: 2025-10-02 12:19:44.473 2 DEBUG nova.network.neutron [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Successfully created port: 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.348 2 DEBUG nova.network.neutron [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Successfully updated port: 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.363 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "refresh_cache-b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.363 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquired lock "refresh_cache-b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.364 2 DEBUG nova.network.neutron [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.507 2 DEBUG nova.compute.manager [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Received event network-changed-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.508 2 DEBUG nova.compute.manager [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Refreshing instance network info cache due to event network-changed-2e474d48-e6c2-4dcb-88f5-d549a66d95a0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.508 2 DEBUG oslo_concurrency.lockutils [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.633 2 DEBUG nova.network.neutron [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:19:45 np0005466013 nova_compute[192144]: 2025-10-02 12:19:45.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.823 2 DEBUG nova.network.neutron [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Updating instance_info_cache with network_info: [{"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.845 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Releasing lock "refresh_cache-b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.845 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Instance network_info: |[{"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.846 2 DEBUG oslo_concurrency.lockutils [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.846 2 DEBUG nova.network.neutron [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Refreshing network info cache for port 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.849 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Start _get_guest_xml network_info=[{"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.853 2 WARNING nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.856 2 DEBUG nova.virt.libvirt.host [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.857 2 DEBUG nova.virt.libvirt.host [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.862 2 DEBUG nova.virt.libvirt.host [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.863 2 DEBUG nova.virt.libvirt.host [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.865 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.865 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.866 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.866 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.866 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.866 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.866 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.867 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.867 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.867 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.867 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.868 2 DEBUG nova.virt.hardware [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.871 2 DEBUG nova.virt.libvirt.vif [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1605038063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1605038063',id=99,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b80e82beaaf44c9c92c703674174093c',ramdisk_id='',reservation_id='r-5jsojqsl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1172944562',owner_user_name='tempest-InstanceActionsV221TestJSON-1172944562-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:43Z,user_data=None,user_id='8f34ba2e02d4445b8e188b6b8bf09e6b',uuid=b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.871 2 DEBUG nova.network.os_vif_util [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Converting VIF {"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.872 2 DEBUG nova.network.os_vif_util [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:c6:a5,bridge_name='br-int',has_traffic_filtering=True,id=2e474d48-e6c2-4dcb-88f5-d549a66d95a0,network=Network(b5801b7a-2134-4b44-b6db-d862032a4851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e474d48-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.872 2 DEBUG nova.objects.instance [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lazy-loading 'pci_devices' on Instance uuid b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.891 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <uuid>b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef</uuid>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <name>instance-00000063</name>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <nova:name>tempest-InstanceActionsV221TestJSON-server-1605038063</nova:name>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:19:46</nova:creationTime>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:user uuid="8f34ba2e02d4445b8e188b6b8bf09e6b">tempest-InstanceActionsV221TestJSON-1172944562-project-member</nova:user>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:project uuid="b80e82beaaf44c9c92c703674174093c">tempest-InstanceActionsV221TestJSON-1172944562</nova:project>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        <nova:port uuid="2e474d48-e6c2-4dcb-88f5-d549a66d95a0">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <entry name="serial">b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef</entry>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <entry name="uuid">b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef</entry>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk.config"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:88:c6:a5"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <target dev="tap2e474d48-e6"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/console.log" append="off"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:19:46 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:19:46 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:19:46 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:19:46 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.892 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Preparing to wait for external event network-vif-plugged-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.893 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.893 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.893 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.894 2 DEBUG nova.virt.libvirt.vif [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1605038063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1605038063',id=99,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b80e82beaaf44c9c92c703674174093c',ramdisk_id='',reservation_id='r-5jsojqsl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1172944562',owner_user_name='tempest-InstanceActionsV221TestJSON-1172944562-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:43Z,user_data=None,user_id='8f34ba2e02d4445b8e188b6b8bf09e6b',uuid=b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.894 2 DEBUG nova.network.os_vif_util [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Converting VIF {"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.895 2 DEBUG nova.network.os_vif_util [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:c6:a5,bridge_name='br-int',has_traffic_filtering=True,id=2e474d48-e6c2-4dcb-88f5-d549a66d95a0,network=Network(b5801b7a-2134-4b44-b6db-d862032a4851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e474d48-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.895 2 DEBUG os_vif [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:c6:a5,bridge_name='br-int',has_traffic_filtering=True,id=2e474d48-e6c2-4dcb-88f5-d549a66d95a0,network=Network(b5801b7a-2134-4b44-b6db-d862032a4851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e474d48-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.896 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.896 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.899 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e474d48-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.899 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e474d48-e6, col_values=(('external_ids', {'iface-id': '2e474d48-e6c2-4dcb-88f5-d549a66d95a0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:c6:a5', 'vm-uuid': 'b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:46 np0005466013 NetworkManager[51205]: <info>  [1759407586.9021] manager: (tap2e474d48-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.908 2 INFO os_vif [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:c6:a5,bridge_name='br-int',has_traffic_filtering=True,id=2e474d48-e6c2-4dcb-88f5-d549a66d95a0,network=Network(b5801b7a-2134-4b44-b6db-d862032a4851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e474d48-e6')#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.956 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.957 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.957 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] No VIF found with MAC fa:16:3e:88:c6:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:19:46 np0005466013 nova_compute[192144]: 2025-10-02 12:19:46.958 2 INFO nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Using config drive#033[00m
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.440 2 INFO nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Creating config drive at /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk.config#033[00m
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.447 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvnfv89zi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.577 2 DEBUG oslo_concurrency.processutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvnfv89zi" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:47 np0005466013 kernel: tap2e474d48-e6: entered promiscuous mode
Oct  2 08:19:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:47Z|00368|binding|INFO|Claiming lport 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 for this chassis.
Oct  2 08:19:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:47Z|00369|binding|INFO|2e474d48-e6c2-4dcb-88f5-d549a66d95a0: Claiming fa:16:3e:88:c6:a5 10.100.0.5
Oct  2 08:19:47 np0005466013 NetworkManager[51205]: <info>  [1759407587.6467] manager: (tap2e474d48-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:47 np0005466013 systemd-udevd[234346]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:19:47 np0005466013 NetworkManager[51205]: <info>  [1759407587.6833] device (tap2e474d48-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:19:47 np0005466013 NetworkManager[51205]: <info>  [1759407587.6840] device (tap2e474d48-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:47 np0005466013 systemd-machined[152202]: New machine qemu-43-instance-00000063.
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.704 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:c6:a5 10.100.0.5'], port_security=['fa:16:3e:88:c6:a5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5801b7a-2134-4b44-b6db-d862032a4851', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b80e82beaaf44c9c92c703674174093c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd70ee44c-c49a-4e34-b748-be5f26507761', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0a82a3e-689e-43ad-b151-c54f2eee2f82, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=2e474d48-e6c2-4dcb-88f5-d549a66d95a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.706 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 in datapath b5801b7a-2134-4b44-b6db-d862032a4851 bound to our chassis#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.708 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5801b7a-2134-4b44-b6db-d862032a4851#033[00m
Oct  2 08:19:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:47Z|00370|binding|INFO|Setting lport 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 ovn-installed in OVS
Oct  2 08:19:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:47Z|00371|binding|INFO|Setting lport 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 up in Southbound
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:47 np0005466013 systemd[1]: Started Virtual Machine qemu-43-instance-00000063.
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.722 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ee414cdb-0b78-44ab-be9f-34682c5a7821]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.723 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5801b7a-21 in ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.724 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5801b7a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.724 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ab263fd7-ca79-44ab-8a9a-dbf0cc71c7e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.725 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cadac69a-3fe8-4c17-93d3-d52c6cf6bf14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.736 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[13fa0931-263d-4f08-b3a6-b22dd27c9db9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.762 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0b03c289-ad10-4e5e-a7f5-1f85f620df21]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.786 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0badd7-ac36-40e9-a6a4-fd7a2fd6ad75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.792 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2036f6da-c674-4113-8cb2-eba78f4b52e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 systemd-udevd[234350]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:19:47 np0005466013 NetworkManager[51205]: <info>  [1759407587.7940] manager: (tapb5801b7a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/172)
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.820 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[93bc7d21-36fd-40f7-b008-9ead1af6bdfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.823 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc26265-979e-4871-8df0-a5ee75a2e970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 NetworkManager[51205]: <info>  [1759407587.8429] device (tapb5801b7a-20): carrier: link connected
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.848 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[2795bd21-a3c9-4750-bbfe-4213fffcfa97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.864 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc3a8d6-d3c3-4a91-980d-b09dd77ec437]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5801b7a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:5c:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557945, 'reachable_time': 42498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234382, 'error': None, 'target': 'ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.878 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[294a28b3-d460-4fbf-850e-ee317b1fb11e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:5cdb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557945, 'tstamp': 557945}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234383, 'error': None, 'target': 'ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.896 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb840c9-183b-4ad8-bc07-1869b86a04e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5801b7a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:5c:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557945, 'reachable_time': 42498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234384, 'error': None, 'target': 'ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.928 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f360f26c-9d1d-49b9-ad66-fd1f8e3491fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.990 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f15c46-1360-4085-9d74-16f29b7cb168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.992 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5801b7a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.992 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.992 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5801b7a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:47 np0005466013 kernel: tapb5801b7a-20: entered promiscuous mode
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.996 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5801b7a-20, col_values=(('external_ids', {'iface-id': 'f34a26a7-8160-4935-a0fe-15abb1873f7a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:47 np0005466013 NetworkManager[51205]: <info>  [1759407587.9969] manager: (tapb5801b7a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Oct  2 08:19:47 np0005466013 nova_compute[192144]: 2025-10-02 12:19:47.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:47Z|00372|binding|INFO|Releasing lport f34a26a7-8160-4935-a0fe-15abb1873f7a from this chassis (sb_readonly=0)
Oct  2 08:19:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:47.999 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5801b7a-2134-4b44-b6db-d862032a4851.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5801b7a-2134-4b44-b6db-d862032a4851.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:48.000 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b6417eeb-69d4-4008-a3af-fac619302ab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:48.001 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-b5801b7a-2134-4b44-b6db-d862032a4851
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/b5801b7a-2134-4b44-b6db-d862032a4851.pid.haproxy
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID b5801b7a-2134-4b44-b6db-d862032a4851
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:19:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:48.001 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851', 'env', 'PROCESS_TAG=haproxy-b5801b7a-2134-4b44-b6db-d862032a4851', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5801b7a-2134-4b44-b6db-d862032a4851.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:19:48 np0005466013 nova_compute[192144]: 2025-10-02 12:19:48.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:48 np0005466013 podman[234416]: 2025-10-02 12:19:48.374496151 +0000 UTC m=+0.055563531 container create 2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:19:48 np0005466013 systemd[1]: Started libpod-conmon-2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031.scope.
Oct  2 08:19:48 np0005466013 nova_compute[192144]: 2025-10-02 12:19:48.434 2 DEBUG nova.network.neutron [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Updated VIF entry in instance network info cache for port 2e474d48-e6c2-4dcb-88f5-d549a66d95a0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:19:48 np0005466013 podman[234416]: 2025-10-02 12:19:48.340286866 +0000 UTC m=+0.021355226 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:19:48 np0005466013 nova_compute[192144]: 2025-10-02 12:19:48.434 2 DEBUG nova.network.neutron [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Updating instance_info_cache with network_info: [{"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:48 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:19:48 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30ccb393baf47c647fec680b87b37517dccf00005b287d6caba2d50d47efbf20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:19:48 np0005466013 nova_compute[192144]: 2025-10-02 12:19:48.475 2 DEBUG oslo_concurrency.lockutils [req-37d305e0-66af-46ad-ab63-2bd2edaceb56 req-fdad04d2-f131-4ae6-8246-572be2389bb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:48 np0005466013 podman[234416]: 2025-10-02 12:19:48.474234188 +0000 UTC m=+0.155301558 container init 2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:19:48 np0005466013 podman[234416]: 2025-10-02 12:19:48.483101224 +0000 UTC m=+0.164168604 container start 2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:19:48 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [NOTICE]   (234436) : New worker (234438) forked
Oct  2 08:19:48 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [NOTICE]   (234436) : Loading success.
Oct  2 08:19:48 np0005466013 nova_compute[192144]: 2025-10-02 12:19:48.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.084 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407589.0835536, b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.085 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] VM Started (Lifecycle Event)#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.110 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.117 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407589.0839055, b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.117 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.147 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.154 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:49 np0005466013 nova_compute[192144]: 2025-10-02 12:19:49.191 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:49 np0005466013 podman[234455]: 2025-10-02 12:19:49.703203965 +0000 UTC m=+0.072447328 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 08:19:49 np0005466013 podman[234454]: 2025-10-02 12:19:49.704060692 +0000 UTC m=+0.074950305 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:19:49 np0005466013 podman[234456]: 2025-10-02 12:19:49.736818622 +0000 UTC m=+0.102575816 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller)
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.705 2 DEBUG nova.compute.manager [req-2d3bcae6-1916-4127-b442-b622c8102f19 req-86bec861-0e99-487c-a061-fcbe53550a84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Received event network-vif-plugged-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.706 2 DEBUG oslo_concurrency.lockutils [req-2d3bcae6-1916-4127-b442-b622c8102f19 req-86bec861-0e99-487c-a061-fcbe53550a84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.706 2 DEBUG oslo_concurrency.lockutils [req-2d3bcae6-1916-4127-b442-b622c8102f19 req-86bec861-0e99-487c-a061-fcbe53550a84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.706 2 DEBUG oslo_concurrency.lockutils [req-2d3bcae6-1916-4127-b442-b622c8102f19 req-86bec861-0e99-487c-a061-fcbe53550a84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.706 2 DEBUG nova.compute.manager [req-2d3bcae6-1916-4127-b442-b622c8102f19 req-86bec861-0e99-487c-a061-fcbe53550a84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Processing event network-vif-plugged-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.707 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.711 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407591.7109897, b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.712 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.714 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.718 2 INFO nova.virt.libvirt.driver [-] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Instance spawned successfully.#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.718 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.744 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.751 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.755 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.755 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.756 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.756 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.756 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.757 2 DEBUG nova.virt.libvirt.driver [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.784 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.832 2 INFO nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Took 8.59 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.833 2 DEBUG nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:51 np0005466013 nova_compute[192144]: 2025-10-02 12:19:51.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.081 2 INFO nova.compute.manager [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Took 9.55 seconds to build instance.#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.195 2 DEBUG oslo_concurrency.lockutils [None req-53cd5b59-4bb3-4f82-ac9c-de81dfa5304b 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.897 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.898 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.899 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.899 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.900 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.919 2 INFO nova.compute.manager [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Terminating instance#033[00m
Oct  2 08:19:52 np0005466013 nova_compute[192144]: 2025-10-02 12:19:52.937 2 DEBUG nova.compute.manager [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:19:52 np0005466013 kernel: tap2e474d48-e6 (unregistering): left promiscuous mode
Oct  2 08:19:52 np0005466013 NetworkManager[51205]: <info>  [1759407592.9573] device (tap2e474d48-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:19:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:53Z|00373|binding|INFO|Releasing lport 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 from this chassis (sb_readonly=0)
Oct  2 08:19:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:53Z|00374|binding|INFO|Setting lport 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 down in Southbound
Oct  2 08:19:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:19:53Z|00375|binding|INFO|Removing iface tap2e474d48-e6 ovn-installed in OVS
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.021 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:c6:a5 10.100.0.5'], port_security=['fa:16:3e:88:c6:a5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5801b7a-2134-4b44-b6db-d862032a4851', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b80e82beaaf44c9c92c703674174093c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd70ee44c-c49a-4e34-b748-be5f26507761', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0a82a3e-689e-43ad-b151-c54f2eee2f82, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=2e474d48-e6c2-4dcb-88f5-d549a66d95a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.024 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 2e474d48-e6c2-4dcb-88f5-d549a66d95a0 in datapath b5801b7a-2134-4b44-b6db-d862032a4851 unbound from our chassis#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.025 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5801b7a-2134-4b44-b6db-d862032a4851, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.027 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d5f479-6809-462e-b9e5-d1b82d044490]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.028 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851 namespace which is not needed anymore#033[00m
Oct  2 08:19:53 np0005466013 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000063.scope: Deactivated successfully.
Oct  2 08:19:53 np0005466013 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000063.scope: Consumed 2.583s CPU time.
Oct  2 08:19:53 np0005466013 systemd-machined[152202]: Machine qemu-43-instance-00000063 terminated.
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.208 2 INFO nova.virt.libvirt.driver [-] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Instance destroyed successfully.#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.209 2 DEBUG nova.objects.instance [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lazy-loading 'resources' on Instance uuid b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.242 2 DEBUG nova.virt.libvirt.vif [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:19:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1605038063',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1605038063',id=99,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b80e82beaaf44c9c92c703674174093c',ramdisk_id='',reservation_id='r-5jsojqsl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsV221TestJSON-1172944562',owner_user_name='tempest-InstanceActionsV221TestJSON-1172944562-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:19:51Z,user_data=None,user_id='8f34ba2e02d4445b8e188b6b8bf09e6b',uuid=b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.243 2 DEBUG nova.network.os_vif_util [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Converting VIF {"id": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "address": "fa:16:3e:88:c6:a5", "network": {"id": "b5801b7a-2134-4b44-b6db-d862032a4851", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1654874963-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b80e82beaaf44c9c92c703674174093c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e474d48-e6", "ovs_interfaceid": "2e474d48-e6c2-4dcb-88f5-d549a66d95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.244 2 DEBUG nova.network.os_vif_util [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:c6:a5,bridge_name='br-int',has_traffic_filtering=True,id=2e474d48-e6c2-4dcb-88f5-d549a66d95a0,network=Network(b5801b7a-2134-4b44-b6db-d862032a4851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e474d48-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.245 2 DEBUG os_vif [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:c6:a5,bridge_name='br-int',has_traffic_filtering=True,id=2e474d48-e6c2-4dcb-88f5-d549a66d95a0,network=Network(b5801b7a-2134-4b44-b6db-d862032a4851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e474d48-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.248 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e474d48-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.254 2 INFO os_vif [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:c6:a5,bridge_name='br-int',has_traffic_filtering=True,id=2e474d48-e6c2-4dcb-88f5-d549a66d95a0,network=Network(b5801b7a-2134-4b44-b6db-d862032a4851),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e474d48-e6')#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.255 2 INFO nova.virt.libvirt.driver [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Deleting instance files /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef_del#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.255 2 INFO nova.virt.libvirt.driver [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Deletion of /var/lib/nova/instances/b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef_del complete#033[00m
Oct  2 08:19:53 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [NOTICE]   (234436) : haproxy version is 2.8.14-c23fe91
Oct  2 08:19:53 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [NOTICE]   (234436) : path to executable is /usr/sbin/haproxy
Oct  2 08:19:53 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [WARNING]  (234436) : Exiting Master process...
Oct  2 08:19:53 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [WARNING]  (234436) : Exiting Master process...
Oct  2 08:19:53 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [ALERT]    (234436) : Current worker (234438) exited with code 143 (Terminated)
Oct  2 08:19:53 np0005466013 neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851[234431]: [WARNING]  (234436) : All workers exited. Exiting... (0)
Oct  2 08:19:53 np0005466013 systemd[1]: libpod-2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031.scope: Deactivated successfully.
Oct  2 08:19:53 np0005466013 podman[234544]: 2025-10-02 12:19:53.349050868 +0000 UTC m=+0.226269698 container died 2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.423 2 INFO nova.compute.manager [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.423 2 DEBUG oslo.service.loopingcall [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.424 2 DEBUG nova.compute.manager [-] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.424 2 DEBUG nova.network.neutron [-] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:19:53 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031-userdata-shm.mount: Deactivated successfully.
Oct  2 08:19:53 np0005466013 systemd[1]: var-lib-containers-storage-overlay-30ccb393baf47c647fec680b87b37517dccf00005b287d6caba2d50d47efbf20-merged.mount: Deactivated successfully.
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.795 2 DEBUG nova.compute.manager [req-57d15a66-985e-4a51-acdc-761c9218bc0f req-233faefe-8d5b-4716-ab12-6de398397b55 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Received event network-vif-plugged-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.795 2 DEBUG oslo_concurrency.lockutils [req-57d15a66-985e-4a51-acdc-761c9218bc0f req-233faefe-8d5b-4716-ab12-6de398397b55 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.796 2 DEBUG oslo_concurrency.lockutils [req-57d15a66-985e-4a51-acdc-761c9218bc0f req-233faefe-8d5b-4716-ab12-6de398397b55 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.796 2 DEBUG oslo_concurrency.lockutils [req-57d15a66-985e-4a51-acdc-761c9218bc0f req-233faefe-8d5b-4716-ab12-6de398397b55 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.796 2 DEBUG nova.compute.manager [req-57d15a66-985e-4a51-acdc-761c9218bc0f req-233faefe-8d5b-4716-ab12-6de398397b55 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] No waiting events found dispatching network-vif-plugged-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.796 2 WARNING nova.compute.manager [req-57d15a66-985e-4a51-acdc-761c9218bc0f req-233faefe-8d5b-4716-ab12-6de398397b55 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Received unexpected event network-vif-plugged-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:19:53 np0005466013 podman[234544]: 2025-10-02 12:19:53.810554602 +0000 UTC m=+0.687773432 container cleanup 2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:19:53 np0005466013 systemd[1]: libpod-conmon-2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031.scope: Deactivated successfully.
Oct  2 08:19:53 np0005466013 podman[234589]: 2025-10-02 12:19:53.969982967 +0000 UTC m=+0.136647236 container remove 2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.977 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc334c6f-4779-4bc5-b824-2b293f9fdbfe]: (4, ('Thu Oct  2 12:19:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851 (2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031)\n2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031\nThu Oct  2 12:19:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851 (2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031)\n2476153ea00c1d3bf65a3499180c65fb735e4d29d3a5d44edbfed6484224a031\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.980 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7757a37c-b435-43cc-aa49-c284a0f08642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:53.982 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5801b7a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:53 np0005466013 kernel: tapb5801b7a-20: left promiscuous mode
Oct  2 08:19:53 np0005466013 nova_compute[192144]: 2025-10-02 12:19:53.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:54.000 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c50133b0-3cb2-47a6-a7da-b2bf9d3c709f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:54.039 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[eae99ff5-8dc9-45b4-80af-2e5545c69c65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:54.041 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9d25633f-5df7-453c-9a1e-6fc09dd5f38a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:54.061 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[444011d6-2fed-4d21-bdeb-5af82686e4b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557940, 'reachable_time': 41023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234604, 'error': None, 'target': 'ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:54.064 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5801b7a-2134-4b44-b6db-d862032a4851 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:19:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:19:54.064 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f852fe-a613-446b-adf0-b41bf052bc6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:54 np0005466013 systemd[1]: run-netns-ovnmeta\x2db5801b7a\x2d2134\x2d4b44\x2db6db\x2dd862032a4851.mount: Deactivated successfully.
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.648 2 DEBUG nova.network.neutron [-] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.687 2 INFO nova.compute.manager [-] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Took 1.26 seconds to deallocate network for instance.#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.764 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.765 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.796 2 DEBUG nova.compute.manager [req-a2fdbc12-05ca-424e-b71b-28e66778a7c2 req-c7b66166-69f5-4947-9864-8668b274efa7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Received event network-vif-deleted-2e474d48-e6c2-4dcb-88f5-d549a66d95a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.832 2 DEBUG nova.compute.provider_tree [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.847 2 DEBUG nova.scheduler.client.report [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.867 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.894 2 INFO nova.scheduler.client.report [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Deleted allocations for instance b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef#033[00m
Oct  2 08:19:54 np0005466013 nova_compute[192144]: 2025-10-02 12:19:54.986 2 DEBUG oslo_concurrency.lockutils [None req-46271c73-2bb3-4a68-8ef6-177bb3ef72ee 8f34ba2e02d4445b8e188b6b8bf09e6b b80e82beaaf44c9c92c703674174093c - - default default] Lock "b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.419 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.419 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.441 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.484 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.484 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.509 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.537 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.537 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.544 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.544 2 INFO nova.compute.claims [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.606 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.766 2 DEBUG nova.compute.provider_tree [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.778 2 DEBUG nova.scheduler.client.report [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.801 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.801 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.803 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.811 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.811 2 INFO nova.compute.claims [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.884 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.885 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.906 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.924 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:19:56 np0005466013 nova_compute[192144]: 2025-10-02 12:19:56.973 2 DEBUG nova.compute.provider_tree [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.003 2 DEBUG nova.scheduler.client.report [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.031 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.032 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.040 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.041 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.041 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Creating image(s)#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.042 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "/var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.042 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.043 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.055 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.085 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.086 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.110 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.117 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.118 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.118 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.133 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.152 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.189 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.190 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.227 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.228 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.229 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.286 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.287 2 DEBUG nova.virt.disk.api [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Checking if we can resize image /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.288 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.310 2 DEBUG nova.policy [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4407ad6914204506adfa85e11e94e5d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32942e5bdadc470989ae2d43e074169e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.321 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.323 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.323 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Creating image(s)#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.324 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "/var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.324 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.325 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.341 2 DEBUG nova.policy [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4407ad6914204506adfa85e11e94e5d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32942e5bdadc470989ae2d43e074169e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.344 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.366 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.367 2 DEBUG nova.virt.disk.api [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Cannot resize image /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.368 2 DEBUG nova.objects.instance [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'migration_context' on Instance uuid 32c9613e-842d-48c2-be9d-368df9649040 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.382 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.382 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Ensure instance console log exists: /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.383 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.383 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.384 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.406 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.406 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.407 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.423 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.487 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.488 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.525 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.526 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.526 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.590 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.591 2 DEBUG nova.virt.disk.api [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Checking if we can resize image /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.592 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.670 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.672 2 DEBUG nova.virt.disk.api [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Cannot resize image /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.672 2 DEBUG nova.objects.instance [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'migration_context' on Instance uuid c6a7915d-e2fe-4409-a7fe-ead19189985a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.693 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.693 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Ensure instance console log exists: /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.694 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.694 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:57 np0005466013 nova_compute[192144]: 2025-10-02 12:19:57.694 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.031 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Successfully created port: 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.051 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Successfully created port: fc5ead0e-1493-42fc-92c9-2861340d0114 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:58 np0005466013 podman[234635]: 2025-10-02 12:19:58.688237432 +0000 UTC m=+0.062733466 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.806 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Successfully updated port: fc5ead0e-1493-42fc-92c9-2861340d0114 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.830 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "refresh_cache-c6a7915d-e2fe-4409-a7fe-ead19189985a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.830 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquired lock "refresh_cache-c6a7915d-e2fe-4409-a7fe-ead19189985a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.830 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.876 2 DEBUG nova.compute.manager [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received event network-changed-fc5ead0e-1493-42fc-92c9-2861340d0114 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.877 2 DEBUG nova.compute.manager [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Refreshing instance network info cache due to event network-changed-fc5ead0e-1493-42fc-92c9-2861340d0114. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:19:58 np0005466013 nova_compute[192144]: 2025-10-02 12:19:58.877 2 DEBUG oslo_concurrency.lockutils [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c6a7915d-e2fe-4409-a7fe-ead19189985a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.243 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Successfully updated port: 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.258 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "refresh_cache-32c9613e-842d-48c2-be9d-368df9649040" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.259 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquired lock "refresh_cache-32c9613e-842d-48c2-be9d-368df9649040" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.259 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.265 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.349 2 DEBUG nova.compute.manager [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received event network-changed-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.350 2 DEBUG nova.compute.manager [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Refreshing instance network info cache due to event network-changed-3b2fddbf-4dfc-49fd-8f32-e37500c967d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.350 2 DEBUG oslo_concurrency.lockutils [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-32c9613e-842d-48c2-be9d-368df9649040" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:59 np0005466013 nova_compute[192144]: 2025-10-02 12:19:59.446 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.333 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Updating instance_info_cache with network_info: [{"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.391 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Releasing lock "refresh_cache-c6a7915d-e2fe-4409-a7fe-ead19189985a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.391 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Instance network_info: |[{"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.391 2 DEBUG oslo_concurrency.lockutils [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c6a7915d-e2fe-4409-a7fe-ead19189985a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.392 2 DEBUG nova.network.neutron [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Refreshing network info cache for port fc5ead0e-1493-42fc-92c9-2861340d0114 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.394 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Start _get_guest_xml network_info=[{"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.398 2 WARNING nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.402 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.402 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.405 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.406 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.407 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.407 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.408 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.408 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.408 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.408 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.408 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.408 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.409 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.409 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.409 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.409 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.413 2 DEBUG nova.virt.libvirt.vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1685103395',display_name='tempest-tempest.common.compute-instance-1685103395-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1685103395-2',id=104,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-70160ig1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleC
reateTestJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:57Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=c6a7915d-e2fe-4409-a7fe-ead19189985a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.414 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.414 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:2a:ea,bridge_name='br-int',has_traffic_filtering=True,id=fc5ead0e-1493-42fc-92c9-2861340d0114,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc5ead0e-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.416 2 DEBUG nova.objects.instance [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'pci_devices' on Instance uuid c6a7915d-e2fe-4409-a7fe-ead19189985a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.422 2 DEBUG nova.network.neutron [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Updating instance_info_cache with network_info: [{"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.431 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <uuid>c6a7915d-e2fe-4409-a7fe-ead19189985a</uuid>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <name>instance-00000068</name>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:name>tempest-tempest.common.compute-instance-1685103395-2</nova:name>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:20:00</nova:creationTime>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:user uuid="4407ad6914204506adfa85e11e94e5d0">tempest-MultipleCreateTestJSON-1289676365-project-member</nova:user>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:project uuid="32942e5bdadc470989ae2d43e074169e">tempest-MultipleCreateTestJSON-1289676365</nova:project>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:port uuid="fc5ead0e-1493-42fc-92c9-2861340d0114">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="serial">c6a7915d-e2fe-4409-a7fe-ead19189985a</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="uuid">c6a7915d-e2fe-4409-a7fe-ead19189985a</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk.config"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:8f:2a:ea"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <target dev="tapfc5ead0e-14"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/console.log" append="off"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:20:00 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:20:00 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.431 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Preparing to wait for external event network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.432 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.432 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.432 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.433 2 DEBUG nova.virt.libvirt.vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1685103395',display_name='tempest-tempest.common.compute-instance-1685103395-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1685103395-2',id=104,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-70160ig1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest
-MultipleCreateTestJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:57Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=c6a7915d-e2fe-4409-a7fe-ead19189985a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.433 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.434 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:2a:ea,bridge_name='br-int',has_traffic_filtering=True,id=fc5ead0e-1493-42fc-92c9-2861340d0114,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc5ead0e-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.434 2 DEBUG os_vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:2a:ea,bridge_name='br-int',has_traffic_filtering=True,id=fc5ead0e-1493-42fc-92c9-2861340d0114,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc5ead0e-14') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.435 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.438 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc5ead0e-14, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.438 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc5ead0e-14, col_values=(('external_ids', {'iface-id': 'fc5ead0e-1493-42fc-92c9-2861340d0114', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:2a:ea', 'vm-uuid': 'c6a7915d-e2fe-4409-a7fe-ead19189985a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 NetworkManager[51205]: <info>  [1759407600.4411] manager: (tapfc5ead0e-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.448 2 INFO os_vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:2a:ea,bridge_name='br-int',has_traffic_filtering=True,id=fc5ead0e-1493-42fc-92c9-2861340d0114,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc5ead0e-14')#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.449 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Releasing lock "refresh_cache-32c9613e-842d-48c2-be9d-368df9649040" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.450 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Instance network_info: |[{"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.450 2 DEBUG oslo_concurrency.lockutils [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-32c9613e-842d-48c2-be9d-368df9649040" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.450 2 DEBUG nova.network.neutron [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Refreshing network info cache for port 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.453 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Start _get_guest_xml network_info=[{"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.457 2 WARNING nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.462 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.463 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.467 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.467 2 DEBUG nova.virt.libvirt.host [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.468 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.468 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.469 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.469 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.469 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.469 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.469 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.470 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.470 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.470 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.470 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.470 2 DEBUG nova.virt.hardware [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.474 2 DEBUG nova.virt.libvirt.vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1685103395',display_name='tempest-tempest.common.compute-instance-1685103395-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1685103395-1',id=103,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-70160ig1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleC
reateTestJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:56Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=32c9613e-842d-48c2-be9d-368df9649040,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.474 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.474 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:b5:cd,bridge_name='br-int',has_traffic_filtering=True,id=3b2fddbf-4dfc-49fd-8f32-e37500c967d4,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b2fddbf-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.475 2 DEBUG nova.objects.instance [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'pci_devices' on Instance uuid 32c9613e-842d-48c2-be9d-368df9649040 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.512 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <uuid>32c9613e-842d-48c2-be9d-368df9649040</uuid>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <name>instance-00000067</name>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:name>tempest-tempest.common.compute-instance-1685103395-1</nova:name>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:20:00</nova:creationTime>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:user uuid="4407ad6914204506adfa85e11e94e5d0">tempest-MultipleCreateTestJSON-1289676365-project-member</nova:user>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:project uuid="32942e5bdadc470989ae2d43e074169e">tempest-MultipleCreateTestJSON-1289676365</nova:project>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        <nova:port uuid="3b2fddbf-4dfc-49fd-8f32-e37500c967d4">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="serial">32c9613e-842d-48c2-be9d-368df9649040</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="uuid">32c9613e-842d-48c2-be9d-368df9649040</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk.config"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:ed:b5:cd"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <target dev="tap3b2fddbf-4d"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/console.log" append="off"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:20:00 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:20:00 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:20:00 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:20:00 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.513 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Preparing to wait for external event network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.513 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.513 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.514 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.514 2 DEBUG nova.virt.libvirt.vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1685103395',display_name='tempest-tempest.common.compute-instance-1685103395-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1685103395-1',id=103,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-70160ig1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleCreateTestJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:56Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=32c9613e-842d-48c2-be9d-368df9649040,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.515 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.515 2 DEBUG nova.network.os_vif_util [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:b5:cd,bridge_name='br-int',has_traffic_filtering=True,id=3b2fddbf-4dfc-49fd-8f32-e37500c967d4,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b2fddbf-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.515 2 DEBUG os_vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:b5:cd,bridge_name='br-int',has_traffic_filtering=True,id=3b2fddbf-4dfc-49fd-8f32-e37500c967d4,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b2fddbf-4d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.517 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.517 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.521 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b2fddbf-4d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.521 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b2fddbf-4d, col_values=(('external_ids', {'iface-id': '3b2fddbf-4dfc-49fd-8f32-e37500c967d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:b5:cd', 'vm-uuid': '32c9613e-842d-48c2-be9d-368df9649040'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 NetworkManager[51205]: <info>  [1759407600.5240] manager: (tap3b2fddbf-4d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.530 2 INFO os_vif [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:b5:cd,bridge_name='br-int',has_traffic_filtering=True,id=3b2fddbf-4dfc-49fd-8f32-e37500c967d4,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b2fddbf-4d')#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.550 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.550 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.551 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No VIF found with MAC fa:16:3e:8f:2a:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.551 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Using config drive#033[00m
Oct  2 08:20:00 np0005466013 podman[234658]: 2025-10-02 12:20:00.558704419 +0000 UTC m=+0.068514995 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001)
Oct  2 08:20:00 np0005466013 podman[234659]: 2025-10-02 12:20:00.558733279 +0000 UTC m=+0.064977144 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.617 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.618 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.618 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No VIF found with MAC fa:16:3e:ed:b5:cd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:00 np0005466013 nova_compute[192144]: 2025-10-02 12:20:00.618 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Using config drive#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.234 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Creating config drive at /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk.config#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.238 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvpbq8p1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.268 2 INFO nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Creating config drive at /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk.config#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.274 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7jmdx48 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.372 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvpbq8p1" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.407 2 DEBUG oslo_concurrency.processutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7jmdx48" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.4609] manager: (tapfc5ead0e-14): new Tun device (/org/freedesktop/NetworkManager/Devices/176)
Oct  2 08:20:01 np0005466013 kernel: tapfc5ead0e-14: entered promiscuous mode
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00376|binding|INFO|Claiming lport fc5ead0e-1493-42fc-92c9-2861340d0114 for this chassis.
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00377|binding|INFO|fc5ead0e-1493-42fc-92c9-2861340d0114: Claiming fa:16:3e:8f:2a:ea 10.100.0.6
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.478 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:2a:ea 10.100.0.6'], port_security=['fa:16:3e:8f:2a:ea 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c6a7915d-e2fe-4409-a7fe-ead19189985a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=fc5ead0e-1493-42fc-92c9-2861340d0114) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.480 103323 INFO neutron.agent.ovn.metadata.agent [-] Port fc5ead0e-1493-42fc-92c9-2861340d0114 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a bound to our chassis#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.481 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671431ea-00d4-4b91-9313-30948c14cb3a#033[00m
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.4982] manager: (tap3b2fddbf-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.499 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7278b7e8-23dd-4b48-aede-dd9f58c11974]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.500 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap671431ea-01 in ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.502 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap671431ea-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.502 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[93076a22-3edf-4b80-8d6e-34be62c6c907]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 systemd-udevd[234728]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:01 np0005466013 systemd-udevd[234727]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.504 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[be784985-b941-4460-bc08-8e071f12e553]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.5208] device (tapfc5ead0e-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.5217] device (tapfc5ead0e-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.519 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[98d7f6f4-1b83-4ed9-a848-bb48f6b5e96e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 kernel: tap3b2fddbf-4d: entered promiscuous mode
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.5279] device (tap3b2fddbf-4d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.5290] device (tap3b2fddbf-4d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00378|binding|INFO|Claiming lport 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 for this chassis.
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00379|binding|INFO|3b2fddbf-4dfc-49fd-8f32-e37500c967d4: Claiming fa:16:3e:ed:b5:cd 10.100.0.5
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00380|binding|INFO|Setting lport fc5ead0e-1493-42fc-92c9-2861340d0114 ovn-installed in OVS
Oct  2 08:20:01 np0005466013 systemd-machined[152202]: New machine qemu-44-instance-00000068.
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00381|binding|INFO|Setting lport fc5ead0e-1493-42fc-92c9-2861340d0114 up in Southbound
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.542 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:b5:cd 10.100.0.5'], port_security=['fa:16:3e:ed:b5:cd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '32c9613e-842d-48c2-be9d-368df9649040', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=3b2fddbf-4dfc-49fd-8f32-e37500c967d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00382|binding|INFO|Setting lport 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 ovn-installed in OVS
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.547 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f715a152-a00a-4063-ba1c-1ecbe9356e4b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00383|binding|INFO|Setting lport 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 up in Southbound
Oct  2 08:20:01 np0005466013 systemd[1]: Started Virtual Machine qemu-44-instance-00000068.
Oct  2 08:20:01 np0005466013 systemd-machined[152202]: New machine qemu-45-instance-00000067.
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.583 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[04ebfcf8-747c-47e4-a612-58f370888d34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.589 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ebe822-5993-4a11-8b62-2324b5922de2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.5903] manager: (tap671431ea-00): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Oct  2 08:20:01 np0005466013 systemd[1]: Started Virtual Machine qemu-45-instance-00000067.
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.624 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[40c94d48-8045-40a3-a19d-2dc50b70b76e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.628 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[12972ef3-56eb-47c7-96d0-2ed49d23db23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.6529] device (tap671431ea-00): carrier: link connected
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.660 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f88756-4062-4a7d-a3ac-904a7c5a4ed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.684 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[806e8171-308b-4e93-bb56-f0d7b51dcd98]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559327, 'reachable_time': 31413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234770, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.706 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7baae2-3f4c-4eda-80cf-99d12c84fc1c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:de56'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559327, 'tstamp': 559327}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234772, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.726 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f835ab4e-3992-42af-89b4-43528f01c1bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559327, 'reachable_time': 31413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234773, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.767 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[54a61762-15cf-4fca-9012-0132349e1165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.831 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[20d07b3d-5824-44d1-a88b-a6c03d580252]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.832 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.832 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.833 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671431ea-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 NetworkManager[51205]: <info>  [1759407601.8356] manager: (tap671431ea-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Oct  2 08:20:01 np0005466013 kernel: tap671431ea-00: entered promiscuous mode
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.839 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671431ea-00, col_values=(('external_ids', {'iface-id': 'db7c1214-1d9a-4543-8afe-9e322e5cef6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:01Z|00384|binding|INFO|Releasing lport db7c1214-1d9a-4543-8afe-9e322e5cef6d from this chassis (sb_readonly=0)
Oct  2 08:20:01 np0005466013 nova_compute[192144]: 2025-10-02 12:20:01.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.856 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/671431ea-00d4-4b91-9313-30948c14cb3a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/671431ea-00d4-4b91-9313-30948c14cb3a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.857 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[65b8381b-92ac-4249-9535-5a7988249846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.858 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-671431ea-00d4-4b91-9313-30948c14cb3a
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/671431ea-00d4-4b91-9313-30948c14cb3a.pid.haproxy
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 671431ea-00d4-4b91-9313-30948c14cb3a
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:01.860 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'env', 'PROCESS_TAG=haproxy-671431ea-00d4-4b91-9313-30948c14cb3a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/671431ea-00d4-4b91-9313-30948c14cb3a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.077 2 DEBUG nova.compute.manager [req-528ee9f7-5707-4fd3-b09e-d9d6fc9a5e39 req-8d43f531-5414-49a1-a1cb-99163a473c39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received event network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.078 2 DEBUG oslo_concurrency.lockutils [req-528ee9f7-5707-4fd3-b09e-d9d6fc9a5e39 req-8d43f531-5414-49a1-a1cb-99163a473c39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.078 2 DEBUG oslo_concurrency.lockutils [req-528ee9f7-5707-4fd3-b09e-d9d6fc9a5e39 req-8d43f531-5414-49a1-a1cb-99163a473c39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.079 2 DEBUG oslo_concurrency.lockutils [req-528ee9f7-5707-4fd3-b09e-d9d6fc9a5e39 req-8d43f531-5414-49a1-a1cb-99163a473c39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.079 2 DEBUG nova.compute.manager [req-528ee9f7-5707-4fd3-b09e-d9d6fc9a5e39 req-8d43f531-5414-49a1-a1cb-99163a473c39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Processing event network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.302 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.303 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.304 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:02 np0005466013 podman[234818]: 2025-10-02 12:20:02.326187499 +0000 UTC m=+0.083478982 container create 118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.339 2 DEBUG nova.network.neutron [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Updated VIF entry in instance network info cache for port fc5ead0e-1493-42fc-92c9-2861340d0114. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.344 2 DEBUG nova.network.neutron [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Updating instance_info_cache with network_info: [{"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:02 np0005466013 podman[234818]: 2025-10-02 12:20:02.270219565 +0000 UTC m=+0.027511068 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.368 2 DEBUG oslo_concurrency.lockutils [req-6f653e64-0711-4a44-b5c6-ede8f8d875d4 req-4ed5b253-c5ee-401b-89a3-58495a2d2cf1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c6a7915d-e2fe-4409-a7fe-ead19189985a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:02 np0005466013 systemd[1]: Started libpod-conmon-118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360.scope.
Oct  2 08:20:02 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:20:02 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/954ac8d8e8204b6e3f1c175984ab18e3a1a11c1c881ba4296be113ebedd44ee9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:02 np0005466013 podman[234818]: 2025-10-02 12:20:02.450322394 +0000 UTC m=+0.207613907 container init 118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.458 2 DEBUG nova.network.neutron [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Updated VIF entry in instance network info cache for port 3b2fddbf-4dfc-49fd-8f32-e37500c967d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.458 2 DEBUG nova.network.neutron [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Updating instance_info_cache with network_info: [{"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:02 np0005466013 podman[234818]: 2025-10-02 12:20:02.458626504 +0000 UTC m=+0.215917987 container start 118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.480 2 DEBUG oslo_concurrency.lockutils [req-ce19bfea-f7f6-47b7-955d-6fd92f1013e4 req-80459423-6620-432c-b4eb-f55772ad5a6a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-32c9613e-842d-48c2-be9d-368df9649040" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:02 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[234832]: [NOTICE]   (234836) : New worker (234838) forked
Oct  2 08:20:02 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[234832]: [NOTICE]   (234836) : Loading success.
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.517 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407602.5165658, c6a7915d-e2fe-4409-a7fe-ead19189985a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.518 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.520 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.530 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.542 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.544 2 INFO nova.virt.libvirt.driver [-] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Instance spawned successfully.#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.544 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.550 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.564 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a unbound from our chassis#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.566 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671431ea-00d4-4b91-9313-30948c14cb3a#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.571 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.572 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407602.5169437, c6a7915d-e2fe-4409-a7fe-ead19189985a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.572 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.578 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.579 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.579 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.580 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.580 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.581 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.589 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f5754332-a37b-4318-9c8e-5513ff165345]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.594 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.599 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407602.525182, c6a7915d-e2fe-4409-a7fe-ead19189985a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.600 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.623 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[1676e8a0-7462-40ee-9248-734c4f50734e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.627 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4c8081-205e-41d5-a071-5379a3e93d9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.629 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.633 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.659 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a30836d4-9f8f-4eab-b944-52c3d66fe230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.660 2 INFO nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Took 5.34 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.660 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.661 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.662 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407602.5901437, 32c9613e-842d-48c2-be9d-368df9649040 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.662 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.679 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[918e89fe-ab6e-498f-b9fd-418f854b6279]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559327, 'reachable_time': 31413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234852, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.694 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.695 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5c081d1f-d5f3-4210-86f1-b4cf3f6f0a13]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559341, 'tstamp': 559341}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234853, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559344, 'tstamp': 559344}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234853, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.698 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.699 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407602.59029, 32c9613e-842d-48c2-be9d-368df9649040 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.699 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.702 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671431ea-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.702 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.703 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671431ea-00, col_values=(('external_ids', {'iface-id': 'db7c1214-1d9a-4543-8afe-9e322e5cef6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:02.703 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.726 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.730 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.761 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.774 2 INFO nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Took 6.20 seconds to build instance.
Oct  2 08:20:02 np0005466013 nova_compute[192144]: 2025-10-02 12:20:02.792 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:03 np0005466013 nova_compute[192144]: 2025-10-02 12:20:03.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.192 2 DEBUG nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received event network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.193 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.193 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.194 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.194 2 DEBUG nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] No waiting events found dispatching network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.194 2 WARNING nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received unexpected event network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 for instance with vm_state active and task_state None.
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.195 2 DEBUG nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received event network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.195 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.195 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.196 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.196 2 DEBUG nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Processing event network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.196 2 DEBUG nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received event network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.197 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.197 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.197 2 DEBUG oslo_concurrency.lockutils [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.198 2 DEBUG nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] No waiting events found dispatching network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.198 2 WARNING nova.compute.manager [req-14d0c607-531e-4cdc-a9b9-ba3042e453c7 req-734eb638-c70b-425e-ada9-6280c323ca02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received unexpected event network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 for instance with vm_state building and task_state spawning.
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.199 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.205 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407604.2055502, 32c9613e-842d-48c2-be9d-368df9649040 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.206 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] VM Resumed (Lifecycle Event)
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.208 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.212 2 INFO nova.virt.libvirt.driver [-] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Instance spawned successfully.
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.213 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.227 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.234 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.238 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.238 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.239 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.240 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.240 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.241 2 DEBUG nova.virt.libvirt.driver [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.263 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.308 2 INFO nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Took 7.27 seconds to spawn the instance on the hypervisor.
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.309 2 DEBUG nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.398 2 INFO nova.compute.manager [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Took 7.89 seconds to build instance.
Oct  2 08:20:04 np0005466013 nova_compute[192144]: 2025-10-02 12:20:04.417 2 DEBUG oslo_concurrency.lockutils [None req-b066302e-07a6-410f-8cbc-844e2def1ea6 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.134 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.135 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.135 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.136 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.137 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.191 2 INFO nova.compute.manager [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Terminating instance
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.248 2 DEBUG nova.compute.manager [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:20:05 np0005466013 kernel: tap3b2fddbf-4d (unregistering): left promiscuous mode
Oct  2 08:20:05 np0005466013 NetworkManager[51205]: <info>  [1759407605.2742] device (tap3b2fddbf-4d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:05Z|00385|binding|INFO|Releasing lport 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 from this chassis (sb_readonly=0)
Oct  2 08:20:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:05Z|00386|binding|INFO|Setting lport 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 down in Southbound
Oct  2 08:20:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:05Z|00387|binding|INFO|Removing iface tap3b2fddbf-4d ovn-installed in OVS
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.303 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:b5:cd 10.100.0.5'], port_security=['fa:16:3e:ed:b5:cd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '32c9613e-842d-48c2-be9d-368df9649040', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=3b2fddbf-4dfc-49fd-8f32-e37500c967d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.304 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 3b2fddbf-4dfc-49fd-8f32-e37500c967d4 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a unbound from our chassis
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.305 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671431ea-00d4-4b91-9313-30948c14cb3a
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.320 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.321 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.321 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.321 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.322 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.332 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c4033237-f142-4d04-b96a-fd67246853b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.338 2 INFO nova.compute.manager [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Terminating instance
Oct  2 08:20:05 np0005466013 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000067.scope: Deactivated successfully.
Oct  2 08:20:05 np0005466013 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000067.scope: Consumed 1.949s CPU time.
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.352 2 DEBUG nova.compute.manager [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:20:05 np0005466013 systemd-machined[152202]: Machine qemu-45-instance-00000067 terminated.
Oct  2 08:20:05 np0005466013 kernel: tapfc5ead0e-14 (unregistering): left promiscuous mode
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.384 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e7fe7ee4-82fd-4543-bfbf-3820e3c932c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:05 np0005466013 NetworkManager[51205]: <info>  [1759407605.3852] device (tapfc5ead0e-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.388 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fe800d59-0efe-4762-93eb-1cd425c1301b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:05 np0005466013 podman[234854]: 2025-10-02 12:20:05.402555194 +0000 UTC m=+0.090420307 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:20:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:05Z|00388|binding|INFO|Releasing lport fc5ead0e-1493-42fc-92c9-2861340d0114 from this chassis (sb_readonly=0)
Oct  2 08:20:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:05Z|00389|binding|INFO|Setting lport fc5ead0e-1493-42fc-92c9-2861340d0114 down in Southbound
Oct  2 08:20:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:05Z|00390|binding|INFO|Removing iface tapfc5ead0e-14 ovn-installed in OVS
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.413 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:2a:ea 10.100.0.6'], port_security=['fa:16:3e:8f:2a:ea 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c6a7915d-e2fe-4409-a7fe-ead19189985a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=fc5ead0e-1493-42fc-92c9-2861340d0114) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:20:05 np0005466013 podman[234857]: 2025-10-02 12:20:05.415144096 +0000 UTC m=+0.092097179 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.430 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[699797a1-67d0-425f-96b6-080e6c20334a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000068.scope: Deactivated successfully.
Oct  2 08:20:05 np0005466013 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000068.scope: Consumed 3.709s CPU time.
Oct  2 08:20:05 np0005466013 systemd-machined[152202]: Machine qemu-44-instance-00000068 terminated.
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.451 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[005f6fa2-9b3c-40fb-9625-85fdcc0f02a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559327, 'reachable_time': 44135, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234910, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.473 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[716355d6-fd68-40be-ab35-67d26cf2f978]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559341, 'tstamp': 559341}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234911, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559344, 'tstamp': 559344}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234911, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.478 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:05 np0005466013 NetworkManager[51205]: <info>  [1759407605.4788] manager: (tap3b2fddbf-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.544 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671431ea-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.544 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.544 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671431ea-00, col_values=(('external_ids', {'iface-id': 'db7c1214-1d9a-4543-8afe-9e322e5cef6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.545 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.546 103323 INFO neutron.agent.ovn.metadata.agent [-] Port fc5ead0e-1493-42fc-92c9-2861340d0114 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a unbound from our chassis#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.547 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 671431ea-00d4-4b91-9313-30948c14cb3a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.548 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3b009ca9-7cd8-430d-819a-7638a8fcd2fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.549 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a namespace which is not needed anymore#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.597 2 INFO nova.virt.libvirt.driver [-] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Instance destroyed successfully.#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.598 2 DEBUG nova.objects.instance [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'resources' on Instance uuid 32c9613e-842d-48c2-be9d-368df9649040 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.621 2 DEBUG nova.virt.libvirt.vif [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1685103395',display_name='tempest-tempest.common.compute-instance-1685103395-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1685103395-1',id=103,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-70160ig1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleCreateTestJSON-1289676365-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:04Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=32c9613e-842d-48c2-be9d-368df9649040,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.622 2 DEBUG nova.network.os_vif_util [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "address": "fa:16:3e:ed:b5:cd", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b2fddbf-4d", "ovs_interfaceid": "3b2fddbf-4dfc-49fd-8f32-e37500c967d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.622 2 DEBUG nova.network.os_vif_util [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:b5:cd,bridge_name='br-int',has_traffic_filtering=True,id=3b2fddbf-4dfc-49fd-8f32-e37500c967d4,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b2fddbf-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.623 2 DEBUG os_vif [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:b5:cd,bridge_name='br-int',has_traffic_filtering=True,id=3b2fddbf-4dfc-49fd-8f32-e37500c967d4,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b2fddbf-4d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.625 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b2fddbf-4d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.637 2 INFO os_vif [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:b5:cd,bridge_name='br-int',has_traffic_filtering=True,id=3b2fddbf-4dfc-49fd-8f32-e37500c967d4,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b2fddbf-4d')#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.639 2 INFO nova.virt.libvirt.driver [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Deleting instance files /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040_del#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.640 2 INFO nova.virt.libvirt.driver [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Deletion of /var/lib/nova/instances/32c9613e-842d-48c2-be9d-368df9649040_del complete#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.650 2 INFO nova.virt.libvirt.driver [-] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Instance destroyed successfully.#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.651 2 DEBUG nova.objects.instance [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'resources' on Instance uuid c6a7915d-e2fe-4409-a7fe-ead19189985a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.673 2 DEBUG nova.virt.libvirt.vif [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1685103395',display_name='tempest-tempest.common.compute-instance-1685103395-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1685103395-2',id=104,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-10-02T12:20:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-70160ig1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleCreateTestJSON-1289676365-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:02Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=c6a7915d-e2fe-4409-a7fe-ead19189985a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.674 2 DEBUG nova.network.os_vif_util [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "fc5ead0e-1493-42fc-92c9-2861340d0114", "address": "fa:16:3e:8f:2a:ea", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc5ead0e-14", "ovs_interfaceid": "fc5ead0e-1493-42fc-92c9-2861340d0114", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.675 2 DEBUG nova.network.os_vif_util [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:2a:ea,bridge_name='br-int',has_traffic_filtering=True,id=fc5ead0e-1493-42fc-92c9-2861340d0114,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc5ead0e-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.675 2 DEBUG os_vif [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:2a:ea,bridge_name='br-int',has_traffic_filtering=True,id=fc5ead0e-1493-42fc-92c9-2861340d0114,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc5ead0e-14') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.677 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc5ead0e-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.684 2 INFO os_vif [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:2a:ea,bridge_name='br-int',has_traffic_filtering=True,id=fc5ead0e-1493-42fc-92c9-2861340d0114,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc5ead0e-14')#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.684 2 INFO nova.virt.libvirt.driver [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Deleting instance files /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a_del#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.685 2 INFO nova.virt.libvirt.driver [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Deletion of /var/lib/nova/instances/c6a7915d-e2fe-4409-a7fe-ead19189985a_del complete#033[00m
Oct  2 08:20:05 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[234832]: [NOTICE]   (234836) : haproxy version is 2.8.14-c23fe91
Oct  2 08:20:05 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[234832]: [NOTICE]   (234836) : path to executable is /usr/sbin/haproxy
Oct  2 08:20:05 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[234832]: [WARNING]  (234836) : Exiting Master process...
Oct  2 08:20:05 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[234832]: [ALERT]    (234836) : Current worker (234838) exited with code 143 (Terminated)
Oct  2 08:20:05 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[234832]: [WARNING]  (234836) : All workers exited. Exiting... (0)
Oct  2 08:20:05 np0005466013 systemd[1]: libpod-118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360.scope: Deactivated successfully.
Oct  2 08:20:05 np0005466013 podman[234968]: 2025-10-02 12:20:05.710457234 +0000 UTC m=+0.049674958 container died 118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.729 2 INFO nova.compute.manager [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.730 2 DEBUG oslo.service.loopingcall [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.731 2 DEBUG nova.compute.manager [-] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.731 2 DEBUG nova.network.neutron [-] [instance: 32c9613e-842d-48c2-be9d-368df9649040] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:05 np0005466013 systemd[1]: var-lib-containers-storage-overlay-954ac8d8e8204b6e3f1c175984ab18e3a1a11c1c881ba4296be113ebedd44ee9-merged.mount: Deactivated successfully.
Oct  2 08:20:05 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360-userdata-shm.mount: Deactivated successfully.
Oct  2 08:20:05 np0005466013 podman[234968]: 2025-10-02 12:20:05.748301293 +0000 UTC m=+0.087519027 container cleanup 118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:05 np0005466013 systemd[1]: libpod-conmon-118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360.scope: Deactivated successfully.
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.760 2 INFO nova.compute.manager [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.761 2 DEBUG oslo.service.loopingcall [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.761 2 DEBUG nova.compute.manager [-] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.761 2 DEBUG nova.network.neutron [-] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:05 np0005466013 podman[234999]: 2025-10-02 12:20:05.811393538 +0000 UTC m=+0.040222773 container remove 118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.817 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f661831b-5cf1-4519-a076-0e4a6ba6b660]: (4, ('Thu Oct  2 12:20:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a (118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360)\n118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360\nThu Oct  2 12:20:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a (118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360)\n118f1c190894d24876fb6e27df298c50bb39bbacbc108d3b7d6eb5a731f5e360\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.819 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[68d4e0b0-6c12-46bf-b31f-f267666269b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.820 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:05 np0005466013 kernel: tap671431ea-00: left promiscuous mode
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.832 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4476f5b4-173b-41df-a6bc-f039e82413c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 nova_compute[192144]: 2025-10-02 12:20:05.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.872 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[62f7ec14-c059-4965-b696-7a6a5fce97a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.874 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6700e3-e10b-49ad-9ce3-46443606a220]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.896 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[16dea750-4ceb-4efe-b049-5ca0c9005d49]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559319, 'reachable_time': 19319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235014, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:05 np0005466013 systemd[1]: run-netns-ovnmeta\x2d671431ea\x2d00d4\x2d4b91\x2d9313\x2d30948c14cb3a.mount: Deactivated successfully.
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.899 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:20:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:05.899 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[0d08bf64-429b-4a7f-b6f3-86629014bf14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.286 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received event network-vif-unplugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.286 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.286 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.287 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.287 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] No waiting events found dispatching network-vif-unplugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.287 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received event network-vif-unplugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.287 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received event network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.287 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32c9613e-842d-48c2-be9d-368df9649040-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.287 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.288 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.288 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] No waiting events found dispatching network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.288 2 WARNING nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received unexpected event network-vif-plugged-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.288 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received event network-vif-unplugged-fc5ead0e-1493-42fc-92c9-2861340d0114 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.288 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.288 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.289 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.289 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] No waiting events found dispatching network-vif-unplugged-fc5ead0e-1493-42fc-92c9-2861340d0114 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.289 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received event network-vif-unplugged-fc5ead0e-1493-42fc-92c9-2861340d0114 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.289 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received event network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.289 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.289 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.290 2 DEBUG oslo_concurrency.lockutils [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.290 2 DEBUG nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] No waiting events found dispatching network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.290 2 WARNING nova.compute.manager [req-95618218-e40a-470f-9270-9d37fde4acf3 req-b5eedf5b-0e54-4127-b96c-8712b69895c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received unexpected event network-vif-plugged-fc5ead0e-1493-42fc-92c9-2861340d0114 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.401 2 DEBUG nova.network.neutron [-] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.407 2 DEBUG nova.network.neutron [-] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.427 2 INFO nova.compute.manager [-] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Took 0.70 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.427 2 INFO nova.compute.manager [-] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Took 0.67 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.491 2 DEBUG nova.compute.manager [req-9ce5e6c2-0b5e-4c36-b5f2-ca179c6092fc req-a37ed1d7-0848-462c-8ca0-ef3cee900152 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Received event network-vif-deleted-3b2fddbf-4dfc-49fd-8f32-e37500c967d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.540 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.541 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.554 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.621 2 DEBUG nova.compute.provider_tree [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.641 2 DEBUG nova.scheduler.client.report [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.680 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.683 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.713 2 INFO nova.scheduler.client.report [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Deleted allocations for instance 32c9613e-842d-48c2-be9d-368df9649040#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.742 2 DEBUG nova.compute.provider_tree [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.764 2 DEBUG nova.scheduler.client.report [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.793 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.803 2 DEBUG oslo_concurrency.lockutils [None req-90c02ea6-c0ff-4590-8124-106e67bbce5e 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "32c9613e-842d-48c2-be9d-368df9649040" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.825 2 INFO nova.scheduler.client.report [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Deleted allocations for instance c6a7915d-e2fe-4409-a7fe-ead19189985a#033[00m
Oct  2 08:20:06 np0005466013 nova_compute[192144]: 2025-10-02 12:20:06.890 2 DEBUG oslo_concurrency.lockutils [None req-277eb500-1168-487d-83ee-b60efa06a3ed 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "c6a7915d-e2fe-4409-a7fe-ead19189985a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:08 np0005466013 nova_compute[192144]: 2025-10-02 12:20:08.207 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407593.2057495, b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:08 np0005466013 nova_compute[192144]: 2025-10-02 12:20:08.207 2 INFO nova.compute.manager [-] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:08 np0005466013 nova_compute[192144]: 2025-10-02 12:20:08.225 2 DEBUG nova.compute.manager [None req-dd75f958-e317-4b74-97ee-fea101be8946 - - - - - -] [instance: b9e4c414-bee8-4daf-a3f8-d5fc8fb363ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:08 np0005466013 nova_compute[192144]: 2025-10-02 12:20:08.411 2 DEBUG nova.compute.manager [req-627cbe2f-d8ea-4d73-9147-dfda00c75238 req-b38e04cd-c4bb-4483-af31-eb06fc653d95 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Received event network-vif-deleted-fc5ead0e-1493-42fc-92c9-2861340d0114 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:08 np0005466013 nova_compute[192144]: 2025-10-02 12:20:08.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.453 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.454 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.493 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.570 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.570 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.597 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.636 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.636 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.644 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.645 2 INFO nova.compute.claims [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.758 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.916 2 DEBUG nova.compute.provider_tree [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.949 2 DEBUG nova.scheduler.client.report [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.986 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.986 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.988 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.994 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:20:10 np0005466013 nova_compute[192144]: 2025-10-02 12:20:10.994 2 INFO nova.compute.claims [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.164 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.165 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.240 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.268 2 DEBUG nova.compute.provider_tree [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.305 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.312 2 DEBUG nova.scheduler.client.report [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.352 2 DEBUG nova.policy [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4407ad6914204506adfa85e11e94e5d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32942e5bdadc470989ae2d43e074169e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.440 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.441 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.536 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.537 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.595 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.598 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.598 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.599 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Creating image(s)#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.599 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "/var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.599 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.600 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.611 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.657 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.677 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.678 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.679 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.693 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.715 2 DEBUG nova.policy [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4407ad6914204506adfa85e11e94e5d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32942e5bdadc470989ae2d43e074169e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.751 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.752 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.791 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.792 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.792 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.859 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.861 2 DEBUG nova.virt.disk.api [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Checking if we can resize image /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.861 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.923 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.923 2 DEBUG nova.virt.disk.api [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Cannot resize image /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:20:11 np0005466013 nova_compute[192144]: 2025-10-02 12:20:11.924 2 DEBUG nova.objects.instance [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'migration_context' on Instance uuid 21558304-9e39-4f50-9137-af0282f5cfca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.012 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.012 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Ensure instance console log exists: /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.013 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.013 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.013 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.209 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Successfully created port: d7141e9a-343b-4628-a1b6-25c7db74ece7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.261 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.262 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.263 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Creating image(s)
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.263 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "/var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.263 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.264 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "/var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.276 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.337 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.338 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.339 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.351 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.420 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.421 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.459 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.460 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.460 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.483 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Successfully created port: b860dc8c-5253-4155-89c5-2384f0b08ff2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.530 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.531 2 DEBUG nova.virt.disk.api [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Checking if we can resize image /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.532 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.590 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.591 2 DEBUG nova.virt.disk.api [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Cannot resize image /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.592 2 DEBUG nova.objects.instance [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'migration_context' on Instance uuid d88505e6-83d4-4006-a5d8-33e9ab64a380 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.606 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.607 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Ensure instance console log exists: /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.607 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.608 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.608 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:12 np0005466013 nova_compute[192144]: 2025-10-02 12:20:12.980 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Successfully updated port: d7141e9a-343b-4628-a1b6-25c7db74ece7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.001 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "refresh_cache-21558304-9e39-4f50-9137-af0282f5cfca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.002 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquired lock "refresh_cache-21558304-9e39-4f50-9137-af0282f5cfca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.002 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.058 2 DEBUG nova.compute.manager [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received event network-changed-d7141e9a-343b-4628-a1b6-25c7db74ece7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.058 2 DEBUG nova.compute.manager [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Refreshing instance network info cache due to event network-changed-d7141e9a-343b-4628-a1b6-25c7db74ece7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.059 2 DEBUG oslo_concurrency.lockutils [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-21558304-9e39-4f50-9137-af0282f5cfca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.119 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Successfully updated port: b860dc8c-5253-4155-89c5-2384f0b08ff2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.136 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "refresh_cache-d88505e6-83d4-4006-a5d8-33e9ab64a380" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.136 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquired lock "refresh_cache-d88505e6-83d4-4006-a5d8-33e9ab64a380" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.137 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.189 2 DEBUG nova.compute.manager [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received event network-changed-b860dc8c-5253-4155-89c5-2384f0b08ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.190 2 DEBUG nova.compute.manager [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Refreshing instance network info cache due to event network-changed-b860dc8c-5253-4155-89c5-2384f0b08ff2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.190 2 DEBUG oslo_concurrency.lockutils [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-d88505e6-83d4-4006-a5d8-33e9ab64a380" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.196 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.286 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:20:13 np0005466013 nova_compute[192144]: 2025-10-02 12:20:13.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.350 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Updating instance_info_cache with network_info: [{"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.371 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Releasing lock "refresh_cache-21558304-9e39-4f50-9137-af0282f5cfca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.372 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Instance network_info: |[{"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.372 2 DEBUG oslo_concurrency.lockutils [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-21558304-9e39-4f50-9137-af0282f5cfca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.372 2 DEBUG nova.network.neutron [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Refreshing network info cache for port d7141e9a-343b-4628-a1b6-25c7db74ece7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.375 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Start _get_guest_xml network_info=[{"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.380 2 WARNING nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.385 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.385 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.388 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.388 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.389 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.390 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.390 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.390 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.391 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.391 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.391 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.391 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.392 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.392 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.392 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.392 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.396 2 DEBUG nova.virt.libvirt.vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1108095109',display_name='tempest-MultipleCreateTestJSON-server-1108095109-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1108095109-1',id=106,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-xtpuqm3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleCreateT
estJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:11Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=21558304-9e39-4f50-9137-af0282f5cfca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.396 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.397 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:4e,bridge_name='br-int',has_traffic_filtering=True,id=d7141e9a-343b-4628-a1b6-25c7db74ece7,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7141e9a-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.398 2 DEBUG nova.objects.instance [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'pci_devices' on Instance uuid 21558304-9e39-4f50-9137-af0282f5cfca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.413 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <uuid>21558304-9e39-4f50-9137-af0282f5cfca</uuid>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <name>instance-0000006a</name>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:name>tempest-MultipleCreateTestJSON-server-1108095109-1</nova:name>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:20:15</nova:creationTime>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:user uuid="4407ad6914204506adfa85e11e94e5d0">tempest-MultipleCreateTestJSON-1289676365-project-member</nova:user>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:project uuid="32942e5bdadc470989ae2d43e074169e">tempest-MultipleCreateTestJSON-1289676365</nova:project>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:port uuid="d7141e9a-343b-4628-a1b6-25c7db74ece7">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="serial">21558304-9e39-4f50-9137-af0282f5cfca</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="uuid">21558304-9e39-4f50-9137-af0282f5cfca</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk.config"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:10:ac:4e"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <target dev="tapd7141e9a-34"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/console.log" append="off"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:20:15 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:20:15 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.414 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Preparing to wait for external event network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.414 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.415 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.415 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.416 2 DEBUG nova.virt.libvirt.vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1108095109',display_name='tempest-MultipleCreateTestJSON-server-1108095109-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1108095109-1',id=106,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-xtpuqm3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-Multi
pleCreateTestJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:11Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=21558304-9e39-4f50-9137-af0282f5cfca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.416 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.417 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:4e,bridge_name='br-int',has_traffic_filtering=True,id=d7141e9a-343b-4628-a1b6-25c7db74ece7,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7141e9a-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.417 2 DEBUG os_vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:4e,bridge_name='br-int',has_traffic_filtering=True,id=d7141e9a-343b-4628-a1b6-25c7db74ece7,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7141e9a-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.418 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.419 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.422 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7141e9a-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.422 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7141e9a-34, col_values=(('external_ids', {'iface-id': 'd7141e9a-343b-4628-a1b6-25c7db74ece7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:ac:4e', 'vm-uuid': '21558304-9e39-4f50-9137-af0282f5cfca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 NetworkManager[51205]: <info>  [1759407615.4265] manager: (tapd7141e9a-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.430 2 INFO os_vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:4e,bridge_name='br-int',has_traffic_filtering=True,id=d7141e9a-343b-4628-a1b6-25c7db74ece7,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7141e9a-34')#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.500 2 DEBUG nova.network.neutron [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Updating instance_info_cache with network_info: [{"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.508 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.508 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.508 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No VIF found with MAC fa:16:3e:10:ac:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.509 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Using config drive#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.537 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Releasing lock "refresh_cache-d88505e6-83d4-4006-a5d8-33e9ab64a380" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.538 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Instance network_info: |[{"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.538 2 DEBUG oslo_concurrency.lockutils [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-d88505e6-83d4-4006-a5d8-33e9ab64a380" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.539 2 DEBUG nova.network.neutron [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Refreshing network info cache for port b860dc8c-5253-4155-89c5-2384f0b08ff2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.541 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Start _get_guest_xml network_info=[{"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.546 2 WARNING nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.549 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.550 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.553 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.553 2 DEBUG nova.virt.libvirt.host [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.554 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.555 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.555 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.555 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.555 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.556 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.556 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.556 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.556 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.556 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.557 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.557 2 DEBUG nova.virt.hardware [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.560 2 DEBUG nova.virt.libvirt.vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1108095109',display_name='tempest-MultipleCreateTestJSON-server-1108095109-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1108095109-2',id=107,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-xtpuqm3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleCreateT
estJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:11Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=d88505e6-83d4-4006-a5d8-33e9ab64a380,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.560 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.561 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:59:b6,bridge_name='br-int',has_traffic_filtering=True,id=b860dc8c-5253-4155-89c5-2384f0b08ff2,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb860dc8c-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.562 2 DEBUG nova.objects.instance [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'pci_devices' on Instance uuid d88505e6-83d4-4006-a5d8-33e9ab64a380 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.582 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <uuid>d88505e6-83d4-4006-a5d8-33e9ab64a380</uuid>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <name>instance-0000006b</name>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:name>tempest-MultipleCreateTestJSON-server-1108095109-2</nova:name>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:20:15</nova:creationTime>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:user uuid="4407ad6914204506adfa85e11e94e5d0">tempest-MultipleCreateTestJSON-1289676365-project-member</nova:user>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:project uuid="32942e5bdadc470989ae2d43e074169e">tempest-MultipleCreateTestJSON-1289676365</nova:project>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        <nova:port uuid="b860dc8c-5253-4155-89c5-2384f0b08ff2">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="serial">d88505e6-83d4-4006-a5d8-33e9ab64a380</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="uuid">d88505e6-83d4-4006-a5d8-33e9ab64a380</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk.config"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:19:59:b6"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <target dev="tapb860dc8c-52"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/console.log" append="off"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:20:15 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:20:15 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:20:15 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:20:15 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.584 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Preparing to wait for external event network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.584 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.584 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.584 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.585 2 DEBUG nova.virt.libvirt.vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1108095109',display_name='tempest-MultipleCreateTestJSON-server-1108095109-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1108095109-2',id=107,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-xtpuqm3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-Multi
pleCreateTestJSON-1289676365-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:11Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=d88505e6-83d4-4006-a5d8-33e9ab64a380,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.586 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.586 2 DEBUG nova.network.os_vif_util [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:59:b6,bridge_name='br-int',has_traffic_filtering=True,id=b860dc8c-5253-4155-89c5-2384f0b08ff2,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb860dc8c-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.586 2 DEBUG os_vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:59:b6,bridge_name='br-int',has_traffic_filtering=True,id=b860dc8c-5253-4155-89c5-2384f0b08ff2,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb860dc8c-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.587 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.588 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.591 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb860dc8c-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.592 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb860dc8c-52, col_values=(('external_ids', {'iface-id': 'b860dc8c-5253-4155-89c5-2384f0b08ff2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:59:b6', 'vm-uuid': 'd88505e6-83d4-4006-a5d8-33e9ab64a380'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 NetworkManager[51205]: <info>  [1759407615.5940] manager: (tapb860dc8c-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.601 2 INFO os_vif [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:59:b6,bridge_name='br-int',has_traffic_filtering=True,id=b860dc8c-5253-4155-89c5-2384f0b08ff2,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb860dc8c-52')#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.651 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.652 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.652 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] No VIF found with MAC fa:16:3e:19:59:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:15 np0005466013 nova_compute[192144]: 2025-10-02 12:20:15.653 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Using config drive#033[00m
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.572 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Creating config drive at /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk.config#033[00m
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.578 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpubnfpvox execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.607 2 INFO nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Creating config drive at /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk.config#033[00m
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.612 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd8j04qol execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.712 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpubnfpvox" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.749 2 DEBUG oslo_concurrency.processutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd8j04qol" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:16 np0005466013 kernel: tapb860dc8c-52: entered promiscuous mode
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.7803] manager: (tapb860dc8c-52): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00391|binding|INFO|Claiming lport b860dc8c-5253-4155-89c5-2384f0b08ff2 for this chassis.
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00392|binding|INFO|b860dc8c-5253-4155-89c5-2384f0b08ff2: Claiming fa:16:3e:19:59:b6 10.100.0.12
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.796 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:59:b6 10.100.0.12'], port_security=['fa:16:3e:19:59:b6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd88505e6-83d4-4006-a5d8-33e9ab64a380', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=b860dc8c-5253-4155-89c5-2384f0b08ff2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.797 103323 INFO neutron.agent.ovn.metadata.agent [-] Port b860dc8c-5253-4155-89c5-2384f0b08ff2 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a bound to our chassis#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.799 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671431ea-00d4-4b91-9313-30948c14cb3a#033[00m
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00393|binding|INFO|Setting lport b860dc8c-5253-4155-89c5-2384f0b08ff2 ovn-installed in OVS
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00394|binding|INFO|Setting lport b860dc8c-5253-4155-89c5-2384f0b08ff2 up in Southbound
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.811 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[92fa09fc-3ff5-47ff-a508-040211d9f98b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.812 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap671431ea-01 in ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.814 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap671431ea-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.814 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb29c7d-38b4-493c-a887-3f709c601486]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 systemd-udevd[235081]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.815 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d69376e1-853b-40ad-bd8d-58b617bfe03d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.824 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[54a3ae73-a018-4c8c-ad8e-f2fb5ff254c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.8279] device (tapb860dc8c-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.8303] manager: (tapd7141e9a-34): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Oct  2 08:20:16 np0005466013 kernel: tapd7141e9a-34: entered promiscuous mode
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.8309] device (tapb860dc8c-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00395|binding|INFO|Claiming lport d7141e9a-343b-4628-a1b6-25c7db74ece7 for this chassis.
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00396|binding|INFO|d7141e9a-343b-4628-a1b6-25c7db74ece7: Claiming fa:16:3e:10:ac:4e 10.100.0.4
Oct  2 08:20:16 np0005466013 systemd-machined[152202]: New machine qemu-46-instance-0000006b.
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.8419] device (tapd7141e9a-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.8431] device (tapd7141e9a-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.844 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:ac:4e 10.100.0.4'], port_security=['fa:16:3e:10:ac:4e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '21558304-9e39-4f50-9137-af0282f5cfca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d7141e9a-343b-4628-a1b6-25c7db74ece7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00397|binding|INFO|Setting lport d7141e9a-343b-4628-a1b6-25c7db74ece7 ovn-installed in OVS
Oct  2 08:20:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:16Z|00398|binding|INFO|Setting lport d7141e9a-343b-4628-a1b6-25c7db74ece7 up in Southbound
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005466013 systemd[1]: Started Virtual Machine qemu-46-instance-0000006b.
Oct  2 08:20:16 np0005466013 nova_compute[192144]: 2025-10-02 12:20:16.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.860 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9873b4c2-5ef4-4893-91aa-12546d534a89]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 systemd-machined[152202]: New machine qemu-47-instance-0000006a.
Oct  2 08:20:16 np0005466013 systemd[1]: Started Virtual Machine qemu-47-instance-0000006a.
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.895 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[74ab0a10-9d6b-41a0-95f9-5902fbd4e62e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.9024] manager: (tap671431ea-00): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.901 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e178ed89-7207-4b1d-a1f6-ca68539b5771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 systemd-udevd[235089]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.931 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[62ff1136-7265-4105-98aa-157f6fe40279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.935 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b72859-6e41-4fd7-ba90-7ca1502313c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 NetworkManager[51205]: <info>  [1759407616.9662] device (tap671431ea-00): carrier: link connected
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.967 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[dee2d988-89c6-40ad-b58a-addd1a0c051b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:16.986 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9f803211-ebf5-4df2-a599-9aa5c29d4b5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560858, 'reachable_time': 21654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235128, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.004 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b84b65e8-0eb0-4ca8-9fe4-919b7e1be16c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:de56'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560858, 'tstamp': 560858}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235130, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.022 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4de515b9-1dcb-4a23-94a7-c1dea26cf69a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560858, 'reachable_time': 21654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235131, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.059 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e09dcb9e-6616-42cf-a876-15f4ab07098d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.123 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ff0f29-fd78-4798-9584-edba1341e702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.125 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.125 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.126 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671431ea-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005466013 NetworkManager[51205]: <info>  [1759407617.1309] manager: (tap671431ea-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Oct  2 08:20:17 np0005466013 kernel: tap671431ea-00: entered promiscuous mode
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.135 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671431ea-00, col_values=(('external_ids', {'iface-id': 'db7c1214-1d9a-4543-8afe-9e322e5cef6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:17Z|00399|binding|INFO|Releasing lport db7c1214-1d9a-4543-8afe-9e322e5cef6d from this chassis (sb_readonly=0)
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.152 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/671431ea-00d4-4b91-9313-30948c14cb3a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/671431ea-00d4-4b91-9313-30948c14cb3a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.153 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d89f5208-909b-43fb-9d77-2ab3252b37a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.154 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-671431ea-00d4-4b91-9313-30948c14cb3a
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/671431ea-00d4-4b91-9313-30948c14cb3a.pid.haproxy
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 671431ea-00d4-4b91-9313-30948c14cb3a
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.155 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'env', 'PROCESS_TAG=haproxy-671431ea-00d4-4b91-9313-30948c14cb3a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/671431ea-00d4-4b91-9313-30948c14cb3a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.541 2 DEBUG nova.compute.manager [req-645f8c25-7b3e-49ac-9f0e-1c0f161343f1 req-25df10c2-3519-4c4f-a377-8e2125a4cd53 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received event network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.541 2 DEBUG oslo_concurrency.lockutils [req-645f8c25-7b3e-49ac-9f0e-1c0f161343f1 req-25df10c2-3519-4c4f-a377-8e2125a4cd53 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.546 2 DEBUG oslo_concurrency.lockutils [req-645f8c25-7b3e-49ac-9f0e-1c0f161343f1 req-25df10c2-3519-4c4f-a377-8e2125a4cd53 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.546 2 DEBUG oslo_concurrency.lockutils [req-645f8c25-7b3e-49ac-9f0e-1c0f161343f1 req-25df10c2-3519-4c4f-a377-8e2125a4cd53 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.546 2 DEBUG nova.compute.manager [req-645f8c25-7b3e-49ac-9f0e-1c0f161343f1 req-25df10c2-3519-4c4f-a377-8e2125a4cd53 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Processing event network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:17 np0005466013 podman[235177]: 2025-10-02 12:20:17.56553664 +0000 UTC m=+0.073152677 container create 5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:17 np0005466013 podman[235177]: 2025-10-02 12:20:17.513195767 +0000 UTC m=+0.020811814 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:17 np0005466013 systemd[1]: Started libpod-conmon-5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f.scope.
Oct  2 08:20:17 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:20:17 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07709a3bd1be0e17cda6725f2904c909a5dfd1dd9f0d3b3f3e9d26d11712e659/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.646 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.647 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407617.6453917, d88505e6-83d4-4006-a5d8-33e9ab64a380 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.647 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.651 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.655 2 INFO nova.virt.libvirt.driver [-] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Instance spawned successfully.#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.655 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:17 np0005466013 podman[235177]: 2025-10-02 12:20:17.664304098 +0000 UTC m=+0.171920125 container init 5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:20:17 np0005466013 podman[235177]: 2025-10-02 12:20:17.672032209 +0000 UTC m=+0.179648236 container start 5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.674 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.678 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.691 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.691 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.692 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.692 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.692 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.693 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:17 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [NOTICE]   (235196) : New worker (235198) forked
Oct  2 08:20:17 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [NOTICE]   (235196) : Loading success.
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.727 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.728 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407617.6470094, d88505e6-83d4-4006-a5d8-33e9ab64a380 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.728 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.737 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d7141e9a-343b-4628-a1b6-25c7db74ece7 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a unbound from our chassis#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.739 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671431ea-00d4-4b91-9313-30948c14cb3a#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.755 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[592cafe4-27f7-47e2-b151-b719485a307d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.768 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.772 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407617.6508029, d88505e6-83d4-4006-a5d8-33e9ab64a380 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.772 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.786 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5c0f86-69d4-4796-af1a-38d99f0ad6f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.790 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a44ef9ad-d8ce-4027-b914-f3167bddf123]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.801 2 INFO nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Took 5.54 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.802 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.806 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.815 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.824 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[01d4fcb2-bf16-4009-8734-9717c4293d21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.846 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3b91cf-18fc-4eca-b86f-f8c607e907ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 306, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560858, 'reachable_time': 21654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235212, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.864 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.865 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5574614b-63fc-41a7-b2a4-4bd7110eb4b5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560870, 'tstamp': 560870}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235213, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560873, 'tstamp': 560873}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235213, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.867 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.871 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671431ea-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.871 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.872 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671431ea-00, col_values=(('external_ids', {'iface-id': 'db7c1214-1d9a-4543-8afe-9e322e5cef6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:17.872 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.923 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407617.9229188, 21558304-9e39-4f50-9137-af0282f5cfca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.924 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.931 2 INFO nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Took 7.23 seconds to build instance.#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.946 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.952 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407617.9233158, 21558304-9e39-4f50-9137-af0282f5cfca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.953 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.960 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.973 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.976 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:17 np0005466013 nova_compute[192144]: 2025-10-02 12:20:17.993 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:18 np0005466013 nova_compute[192144]: 2025-10-02 12:20:18.230 2 DEBUG nova.network.neutron [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Updated VIF entry in instance network info cache for port d7141e9a-343b-4628-a1b6-25c7db74ece7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:18 np0005466013 nova_compute[192144]: 2025-10-02 12:20:18.230 2 DEBUG nova.network.neutron [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Updating instance_info_cache with network_info: [{"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:18 np0005466013 nova_compute[192144]: 2025-10-02 12:20:18.244 2 DEBUG oslo_concurrency.lockutils [req-57452eda-31ee-46a3-8ea7-6803cf493f95 req-583081b1-a2a9-4c55-90ba-ccfd30a183b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-21558304-9e39-4f50-9137-af0282f5cfca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:18 np0005466013 nova_compute[192144]: 2025-10-02 12:20:18.324 2 DEBUG nova.network.neutron [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Updated VIF entry in instance network info cache for port b860dc8c-5253-4155-89c5-2384f0b08ff2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:18 np0005466013 nova_compute[192144]: 2025-10-02 12:20:18.325 2 DEBUG nova.network.neutron [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Updating instance_info_cache with network_info: [{"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:18 np0005466013 nova_compute[192144]: 2025-10-02 12:20:18.347 2 DEBUG oslo_concurrency.lockutils [req-74a8366d-adfc-4adf-8b4c-4dff23e1bae9 req-c1fd3dd8-907e-4e74-80e2-4419d108ca48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-d88505e6-83d4-4006-a5d8-33e9ab64a380" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:18 np0005466013 nova_compute[192144]: 2025-10-02 12:20:18.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.550 2 DEBUG nova.compute.manager [req-2c858723-2d35-4d9d-93a1-ef8dd7d1c3a2 req-d018428c-1112-47bd-a556-67644fd6d671 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received event network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.551 2 DEBUG oslo_concurrency.lockutils [req-2c858723-2d35-4d9d-93a1-ef8dd7d1c3a2 req-d018428c-1112-47bd-a556-67644fd6d671 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.551 2 DEBUG oslo_concurrency.lockutils [req-2c858723-2d35-4d9d-93a1-ef8dd7d1c3a2 req-d018428c-1112-47bd-a556-67644fd6d671 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.552 2 DEBUG oslo_concurrency.lockutils [req-2c858723-2d35-4d9d-93a1-ef8dd7d1c3a2 req-d018428c-1112-47bd-a556-67644fd6d671 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.552 2 DEBUG nova.compute.manager [req-2c858723-2d35-4d9d-93a1-ef8dd7d1c3a2 req-d018428c-1112-47bd-a556-67644fd6d671 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Processing event network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.553 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.557 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.557 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407619.55677, 21558304-9e39-4f50-9137-af0282f5cfca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.557 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.565 2 INFO nova.virt.libvirt.driver [-] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Instance spawned successfully.#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.565 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.592 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.598 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.602 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.602 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.603 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.603 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.603 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.604 2 DEBUG nova.virt.libvirt.driver [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.627 2 DEBUG nova.compute.manager [req-f86a039a-45a3-42f5-8512-7272e8c776ef req-c88b9145-fe08-403a-8bbf-8c52582956f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received event network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.628 2 DEBUG oslo_concurrency.lockutils [req-f86a039a-45a3-42f5-8512-7272e8c776ef req-c88b9145-fe08-403a-8bbf-8c52582956f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.628 2 DEBUG oslo_concurrency.lockutils [req-f86a039a-45a3-42f5-8512-7272e8c776ef req-c88b9145-fe08-403a-8bbf-8c52582956f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.629 2 DEBUG oslo_concurrency.lockutils [req-f86a039a-45a3-42f5-8512-7272e8c776ef req-c88b9145-fe08-403a-8bbf-8c52582956f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.629 2 DEBUG nova.compute.manager [req-f86a039a-45a3-42f5-8512-7272e8c776ef req-c88b9145-fe08-403a-8bbf-8c52582956f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] No waiting events found dispatching network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.629 2 WARNING nova.compute.manager [req-f86a039a-45a3-42f5-8512-7272e8c776ef req-c88b9145-fe08-403a-8bbf-8c52582956f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received unexpected event network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.636 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.708 2 INFO nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Took 8.11 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.709 2 DEBUG nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.796 2 INFO nova.compute.manager [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Took 9.19 seconds to build instance.#033[00m
Oct  2 08:20:19 np0005466013 nova_compute[192144]: 2025-10-02 12:20:19.820 2 DEBUG oslo_concurrency.lockutils [None req-05153920-0eb1-4795-be7e-422dcca64384 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.446 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.447 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.447 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.448 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.448 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.461 2 INFO nova.compute.manager [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Terminating instance#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.473 2 DEBUG nova.compute.manager [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:20:20 np0005466013 kernel: tapd7141e9a-34 (unregistering): left promiscuous mode
Oct  2 08:20:20 np0005466013 NetworkManager[51205]: <info>  [1759407620.5092] device (tapd7141e9a-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:20Z|00400|binding|INFO|Releasing lport d7141e9a-343b-4628-a1b6-25c7db74ece7 from this chassis (sb_readonly=0)
Oct  2 08:20:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:20Z|00401|binding|INFO|Setting lport d7141e9a-343b-4628-a1b6-25c7db74ece7 down in Southbound
Oct  2 08:20:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:20Z|00402|binding|INFO|Removing iface tapd7141e9a-34 ovn-installed in OVS
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.534 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:ac:4e 10.100.0.4'], port_security=['fa:16:3e:10:ac:4e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '21558304-9e39-4f50-9137-af0282f5cfca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d7141e9a-343b-4628-a1b6-25c7db74ece7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.537 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d7141e9a-343b-4628-a1b6-25c7db74ece7 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a unbound from our chassis#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.539 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671431ea-00d4-4b91-9313-30948c14cb3a#033[00m
Oct  2 08:20:20 np0005466013 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Oct  2 08:20:20 np0005466013 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006a.scope: Consumed 1.811s CPU time.
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.562 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[677e5e37-0f86-43c2-8b25-8dd54ae15a21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 systemd-machined[152202]: Machine qemu-47-instance-0000006a terminated.
Oct  2 08:20:20 np0005466013 podman[235214]: 2025-10-02 12:20:20.585739753 +0000 UTC m=+0.054812630 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.587 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407605.5856888, 32c9613e-842d-48c2-be9d-368df9649040 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.587 2 INFO nova.compute.manager [-] [instance: 32c9613e-842d-48c2-be9d-368df9649040] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.599 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[74b888da-bba1-4920-a37b-8e4ad42426d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.602 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.603 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.603 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.602 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd73e31-8c3d-47d3-84c2-d7394c2905f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.603 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.603 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.614 2 DEBUG nova.compute.manager [None req-696d8c1a-a03a-45e7-89b3-553280d3b6ef - - - - - -] [instance: 32c9613e-842d-48c2-be9d-368df9649040] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.616 2 INFO nova.compute.manager [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Terminating instance#033[00m
Oct  2 08:20:20 np0005466013 podman[235217]: 2025-10-02 12:20:20.617604983 +0000 UTC m=+0.086953379 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.634 2 DEBUG nova.compute.manager [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.638 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[567ac7f4-fc2e-4710-9e88-f96c6328a30b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.649 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407605.648848, c6a7915d-e2fe-4409-a7fe-ead19189985a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.650 2 INFO nova.compute.manager [-] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:20 np0005466013 kernel: tapb860dc8c-52 (unregistering): left promiscuous mode
Oct  2 08:20:20 np0005466013 NetworkManager[51205]: <info>  [1759407620.6587] device (tapb860dc8c-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 podman[235218]: 2025-10-02 12:20:20.661830579 +0000 UTC m=+0.129816353 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.662 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[35779b57-b8d5-4269-8930-051d8293c9b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671431ea-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:de:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560858, 'reachable_time': 21654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235290, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:20Z|00403|binding|INFO|Releasing lport b860dc8c-5253-4155-89c5-2384f0b08ff2 from this chassis (sb_readonly=0)
Oct  2 08:20:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:20Z|00404|binding|INFO|Setting lport b860dc8c-5253-4155-89c5-2384f0b08ff2 down in Southbound
Oct  2 08:20:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:20Z|00405|binding|INFO|Removing iface tapb860dc8c-52 ovn-installed in OVS
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.679 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:59:b6 10.100.0.12'], port_security=['fa:16:3e:19:59:b6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd88505e6-83d4-4006-a5d8-33e9ab64a380', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671431ea-00d4-4b91-9313-30948c14cb3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32942e5bdadc470989ae2d43e074169e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cfb3aec0-81fc-4b5f-b44d-ec3c2e8c38e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bad4103-c8e8-465b-8547-fe93b2136e07, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=b860dc8c-5253-4155-89c5-2384f0b08ff2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.681 2 DEBUG nova.compute.manager [None req-ec331501-b691-456c-86b5-07b4e3b07bc7 - - - - - -] [instance: c6a7915d-e2fe-4409-a7fe-ead19189985a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.682 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dd6e7385-2629-4d4f-a125-cc54687f3593]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560870, 'tstamp': 560870}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235296, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671431ea-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560873, 'tstamp': 560873}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235296, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.683 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 NetworkManager[51205]: <info>  [1759407620.6941] manager: (tapd7141e9a-34): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.694 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671431ea-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.695 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.695 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671431ea-00, col_values=(('external_ids', {'iface-id': 'db7c1214-1d9a-4543-8afe-9e322e5cef6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.695 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.697 103323 INFO neutron.agent.ovn.metadata.agent [-] Port b860dc8c-5253-4155-89c5-2384f0b08ff2 in datapath 671431ea-00d4-4b91-9313-30948c14cb3a unbound from our chassis#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.698 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 671431ea-00d4-4b91-9313-30948c14cb3a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.698 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[26bfca4f-4701-4a9a-8bcd-e2ccfe29a2cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.700 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a namespace which is not needed anymore#033[00m
Oct  2 08:20:20 np0005466013 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Oct  2 08:20:20 np0005466013 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006b.scope: Consumed 3.661s CPU time.
Oct  2 08:20:20 np0005466013 systemd-machined[152202]: Machine qemu-46-instance-0000006b terminated.
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.733 2 INFO nova.virt.libvirt.driver [-] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Instance destroyed successfully.#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.733 2 DEBUG nova.objects.instance [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'resources' on Instance uuid 21558304-9e39-4f50-9137-af0282f5cfca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.754 2 DEBUG nova.virt.libvirt.vif [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1108095109',display_name='tempest-MultipleCreateTestJSON-server-1108095109-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1108095109-1',id=106,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-xtpuqm3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleCreateTestJSON-1289676365-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:19Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=21558304-9e39-4f50-9137-af0282f5cfca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.755 2 DEBUG nova.network.os_vif_util [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "address": "fa:16:3e:10:ac:4e", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7141e9a-34", "ovs_interfaceid": "d7141e9a-343b-4628-a1b6-25c7db74ece7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.755 2 DEBUG nova.network.os_vif_util [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:4e,bridge_name='br-int',has_traffic_filtering=True,id=d7141e9a-343b-4628-a1b6-25c7db74ece7,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7141e9a-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.756 2 DEBUG os_vif [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:4e,bridge_name='br-int',has_traffic_filtering=True,id=d7141e9a-343b-4628-a1b6-25c7db74ece7,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7141e9a-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.758 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7141e9a-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.766 2 INFO os_vif [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:4e,bridge_name='br-int',has_traffic_filtering=True,id=d7141e9a-343b-4628-a1b6-25c7db74ece7,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7141e9a-34')#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.766 2 INFO nova.virt.libvirt.driver [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Deleting instance files /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca_del#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.767 2 INFO nova.virt.libvirt.driver [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Deletion of /var/lib/nova/instances/21558304-9e39-4f50-9137-af0282f5cfca_del complete#033[00m
Oct  2 08:20:20 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [NOTICE]   (235196) : haproxy version is 2.8.14-c23fe91
Oct  2 08:20:20 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [NOTICE]   (235196) : path to executable is /usr/sbin/haproxy
Oct  2 08:20:20 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [WARNING]  (235196) : Exiting Master process...
Oct  2 08:20:20 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [WARNING]  (235196) : Exiting Master process...
Oct  2 08:20:20 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [ALERT]    (235196) : Current worker (235198) exited with code 143 (Terminated)
Oct  2 08:20:20 np0005466013 neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a[235192]: [WARNING]  (235196) : All workers exited. Exiting... (0)
Oct  2 08:20:20 np0005466013 systemd[1]: libpod-5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f.scope: Deactivated successfully.
Oct  2 08:20:20 np0005466013 podman[235334]: 2025-10-02 12:20:20.835010192 +0000 UTC m=+0.047294755 container died 5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:20:20 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f-userdata-shm.mount: Deactivated successfully.
Oct  2 08:20:20 np0005466013 systemd[1]: var-lib-containers-storage-overlay-07709a3bd1be0e17cda6725f2904c909a5dfd1dd9f0d3b3f3e9d26d11712e659-merged.mount: Deactivated successfully.
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.871 2 INFO nova.compute.manager [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.872 2 DEBUG oslo.service.loopingcall [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.872 2 DEBUG nova.compute.manager [-] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.872 2 DEBUG nova.network.neutron [-] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:20 np0005466013 podman[235334]: 2025-10-02 12:20:20.873375515 +0000 UTC m=+0.085660078 container cleanup 5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:20:20 np0005466013 systemd[1]: libpod-conmon-5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f.scope: Deactivated successfully.
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.896 2 INFO nova.virt.libvirt.driver [-] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Instance destroyed successfully.#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.896 2 DEBUG nova.objects.instance [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lazy-loading 'resources' on Instance uuid d88505e6-83d4-4006-a5d8-33e9ab64a380 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.909 2 DEBUG nova.virt.libvirt.vif [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1108095109',display_name='tempest-MultipleCreateTestJSON-server-1108095109-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1108095109-2',id=107,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-10-02T12:20:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32942e5bdadc470989ae2d43e074169e',ramdisk_id='',reservation_id='r-xtpuqm3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1289676365',owner_user_name='tempest-MultipleCreateTestJSON-1289676365-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:17Z,user_data=None,user_id='4407ad6914204506adfa85e11e94e5d0',uuid=d88505e6-83d4-4006-a5d8-33e9ab64a380,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.910 2 DEBUG nova.network.os_vif_util [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converting VIF {"id": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "address": "fa:16:3e:19:59:b6", "network": {"id": "671431ea-00d4-4b91-9313-30948c14cb3a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1375422383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32942e5bdadc470989ae2d43e074169e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb860dc8c-52", "ovs_interfaceid": "b860dc8c-5253-4155-89c5-2384f0b08ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.910 2 DEBUG nova.network.os_vif_util [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:59:b6,bridge_name='br-int',has_traffic_filtering=True,id=b860dc8c-5253-4155-89c5-2384f0b08ff2,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb860dc8c-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.911 2 DEBUG os_vif [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:59:b6,bridge_name='br-int',has_traffic_filtering=True,id=b860dc8c-5253-4155-89c5-2384f0b08ff2,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb860dc8c-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.913 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb860dc8c-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.919 2 INFO os_vif [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:59:b6,bridge_name='br-int',has_traffic_filtering=True,id=b860dc8c-5253-4155-89c5-2384f0b08ff2,network=Network(671431ea-00d4-4b91-9313-30948c14cb3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb860dc8c-52')#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.920 2 INFO nova.virt.libvirt.driver [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Deleting instance files /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380_del#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.920 2 INFO nova.virt.libvirt.driver [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Deletion of /var/lib/nova/instances/d88505e6-83d4-4006-a5d8-33e9ab64a380_del complete#033[00m
Oct  2 08:20:20 np0005466013 podman[235375]: 2025-10-02 12:20:20.940314354 +0000 UTC m=+0.042107511 container remove 5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.944 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[49bb1df1-95d7-41a1-aa3c-bbadb6440917]: (4, ('Thu Oct  2 12:20:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a (5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f)\n5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f\nThu Oct  2 12:20:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a (5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f)\n5d6028e642c6a51dfe37045ddb8795a25dee2a1c2d3b4664a64ad5e892c4aa9f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.947 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a306bdcc-4db2-4965-99c7-abf728985d74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.948 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671431ea-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:20 np0005466013 kernel: tap671431ea-00: left promiscuous mode
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.956 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b198de01-8c8c-4913-8494-ceadf9194585]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.991 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[41cc0376-940a-4fa1-a9de-bac64b5b18bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:20.993 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[80860a42-d5a8-40eb-87f7-5020e80cf180]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.994 2 INFO nova.compute.manager [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.995 2 DEBUG oslo.service.loopingcall [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.995 2 DEBUG nova.compute.manager [-] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:20 np0005466013 nova_compute[192144]: 2025-10-02 12:20:20.996 2 DEBUG nova.network.neutron [-] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:21.009 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c5aa4d4d-e9e0-4ebc-99e3-3963e85e75d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560850, 'reachable_time': 39342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235392, 'error': None, 'target': 'ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:21.014 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-671431ea-00d4-4b91-9313-30948c14cb3a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:20:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:21.014 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[541606b7-6860-4449-a222-916d0b654644]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:21 np0005466013 systemd[1]: run-netns-ovnmeta\x2d671431ea\x2d00d4\x2d4b91\x2d9313\x2d30948c14cb3a.mount: Deactivated successfully.
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.396 2 DEBUG nova.network.neutron [-] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.414 2 INFO nova.compute.manager [-] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Took 0.54 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.512 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.512 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.595 2 DEBUG nova.network.neutron [-] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.617 2 INFO nova.compute.manager [-] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Took 0.62 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.621 2 DEBUG nova.scheduler.client.report [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.655 2 DEBUG nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received event network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.656 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.656 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.656 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.656 2 DEBUG nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] No waiting events found dispatching network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.657 2 WARNING nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received unexpected event network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.657 2 DEBUG nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received event network-vif-unplugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.657 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.657 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.658 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.658 2 DEBUG nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] No waiting events found dispatching network-vif-unplugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.658 2 WARNING nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received unexpected event network-vif-unplugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.658 2 DEBUG nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received event network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.659 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "21558304-9e39-4f50-9137-af0282f5cfca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.659 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.659 2 DEBUG oslo_concurrency.lockutils [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.659 2 DEBUG nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] No waiting events found dispatching network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.660 2 WARNING nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received unexpected event network-vif-plugged-d7141e9a-343b-4628-a1b6-25c7db74ece7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.660 2 DEBUG nova.compute.manager [req-eb12ed76-058d-4fd6-bf85-b85285d5d39c req-426e640c-9b6a-4cb8-bb94-f57fdde85695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Received event network-vif-deleted-d7141e9a-343b-4628-a1b6-25c7db74ece7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.711 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.712 2 DEBUG nova.scheduler.client.report [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.712 2 DEBUG nova.compute.provider_tree [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.742 2 DEBUG nova.compute.manager [req-62c020a4-d185-4d23-840f-a4568d6f8158 req-ee00aba5-16b5-4b42-8fba-197aa08d458a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received event network-vif-deleted-b860dc8c-5253-4155-89c5-2384f0b08ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.762 2 DEBUG nova.scheduler.client.report [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.784 2 DEBUG nova.scheduler.client.report [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.803 2 DEBUG nova.compute.manager [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received event network-vif-unplugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.803 2 DEBUG oslo_concurrency.lockutils [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.803 2 DEBUG oslo_concurrency.lockutils [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.804 2 DEBUG oslo_concurrency.lockutils [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.804 2 DEBUG nova.compute.manager [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] No waiting events found dispatching network-vif-unplugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.804 2 WARNING nova.compute.manager [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received unexpected event network-vif-unplugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.804 2 DEBUG nova.compute.manager [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received event network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.805 2 DEBUG oslo_concurrency.lockutils [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.805 2 DEBUG oslo_concurrency.lockutils [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.805 2 DEBUG oslo_concurrency.lockutils [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.806 2 DEBUG nova.compute.manager [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] No waiting events found dispatching network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.806 2 WARNING nova.compute.manager [req-4cf98acb-22fb-4942-ae86-d360b30fa0b2 req-0b9bfcb9-8e59-47dc-9e21-de67f02b4589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Received unexpected event network-vif-plugged-b860dc8c-5253-4155-89c5-2384f0b08ff2 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.833 2 DEBUG nova.compute.provider_tree [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.853 2 DEBUG nova.scheduler.client.report [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.877 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.879 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.909 2 INFO nova.scheduler.client.report [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Deleted allocations for instance 21558304-9e39-4f50-9137-af0282f5cfca#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.952 2 DEBUG nova.compute.provider_tree [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.970 2 DEBUG nova.scheduler.client.report [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.989 2 DEBUG oslo_concurrency.lockutils [None req-d72995f1-5814-423c-b514-6c86fab6bf5b 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "21558304-9e39-4f50-9137-af0282f5cfca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:21 np0005466013 nova_compute[192144]: 2025-10-02 12:20:21.992 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:22 np0005466013 nova_compute[192144]: 2025-10-02 12:20:22.016 2 INFO nova.scheduler.client.report [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Deleted allocations for instance d88505e6-83d4-4006-a5d8-33e9ab64a380#033[00m
Oct  2 08:20:22 np0005466013 nova_compute[192144]: 2025-10-02 12:20:22.079 2 DEBUG oslo_concurrency.lockutils [None req-f570dcae-a2f0-41f6-9410-02a30c8111b0 4407ad6914204506adfa85e11e94e5d0 32942e5bdadc470989ae2d43e074169e - - default default] Lock "d88505e6-83d4-4006-a5d8-33e9ab64a380" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.015 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.015 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.035 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.116 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.116 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.121 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.122 2 INFO nova.compute.claims [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.259 2 DEBUG nova.compute.provider_tree [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.273 2 DEBUG nova.scheduler.client.report [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.307 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.308 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.368 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.369 2 DEBUG nova.network.neutron [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.388 2 INFO nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.405 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.536 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.538 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.538 2 INFO nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Creating image(s)#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.539 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.539 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.540 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.554 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.586 2 DEBUG nova.policy [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.619 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.620 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.621 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.632 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.706 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.708 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.751 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.753 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.753 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.824 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.825 2 DEBUG nova.virt.disk.api [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Checking if we can resize image /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.826 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.888 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.889 2 DEBUG nova.virt.disk.api [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Cannot resize image /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.890 2 DEBUG nova.objects.instance [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'migration_context' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.904 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.905 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Ensure instance console log exists: /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.905 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.906 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:23 np0005466013 nova_compute[192144]: 2025-10-02 12:20:23.906 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:24 np0005466013 nova_compute[192144]: 2025-10-02 12:20:24.442 2 DEBUG nova.network.neutron [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Successfully created port: cae13af9-8175-4eab-b9ec-18019b521d0b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:20:24 np0005466013 nova_compute[192144]: 2025-10-02 12:20:24.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.006 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.007 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.007 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.029 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.179 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.180 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5593MB free_disk=73.35329055786133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.180 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.181 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.249 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance ad2d69bb-3aa9-4c11-b9de-29996574cfa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.250 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.250 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.298 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.311 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.337 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.338 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.458 2 DEBUG nova.network.neutron [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Successfully updated port: cae13af9-8175-4eab-b9ec-18019b521d0b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.478 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.479 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.479 2 DEBUG nova.network.neutron [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.552 2 DEBUG nova.compute.manager [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received event network-changed-cae13af9-8175-4eab-b9ec-18019b521d0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.552 2 DEBUG nova.compute.manager [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Refreshing instance network info cache due to event network-changed-cae13af9-8175-4eab-b9ec-18019b521d0b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.553 2 DEBUG oslo_concurrency.lockutils [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.678 2 DEBUG nova.network.neutron [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:25 np0005466013 nova_compute[192144]: 2025-10-02 12:20:25.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:26 np0005466013 nova_compute[192144]: 2025-10-02 12:20:26.980 2 DEBUG nova.network.neutron [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.008 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.008 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance network_info: |[{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.008 2 DEBUG oslo_concurrency.lockutils [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.009 2 DEBUG nova.network.neutron [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Refreshing network info cache for port cae13af9-8175-4eab-b9ec-18019b521d0b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.012 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Start _get_guest_xml network_info=[{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.017 2 WARNING nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.022 2 DEBUG nova.virt.libvirt.host [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.023 2 DEBUG nova.virt.libvirt.host [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.029 2 DEBUG nova.virt.libvirt.host [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.030 2 DEBUG nova.virt.libvirt.host [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.031 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.032 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.032 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.032 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.032 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.033 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.033 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.033 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.033 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.033 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.034 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.034 2 DEBUG nova.virt.hardware [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.037 2 DEBUG nova.virt.libvirt.vif [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629207280',display_name='tempest-ServerActionsTestOtherB-server-1629207280',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629207280',id=108,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+aqSe4de2VLtRAXN5xeLQn4S/3X8QrNMy2M5WdQ5hviVyEOgqK+m+uWmzPaUSUgE38sEdkytfwUHD32CBZajBt4q3OEf9i3yPJUQGuqp42pAUD+A3EoBIyeptNeSxGdA==',key_name='tempest-keypair-1900171990',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-jtzab0yc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=ad2d69bb-3aa9-4c11-b9de-29996574cfa2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.037 2 DEBUG nova.network.os_vif_util [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.038 2 DEBUG nova.network.os_vif_util [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.039 2 DEBUG nova.objects.instance [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.053 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <uuid>ad2d69bb-3aa9-4c11-b9de-29996574cfa2</uuid>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <name>instance-0000006c</name>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerActionsTestOtherB-server-1629207280</nova:name>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:20:27</nova:creationTime>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:user uuid="0ea122e2fff94f2ba7c78bf30b04029c">tempest-ServerActionsTestOtherB-263921372-project-member</nova:user>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:project uuid="ffce7d629aa24a7f970d93b2a79045f1">tempest-ServerActionsTestOtherB-263921372</nova:project>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        <nova:port uuid="cae13af9-8175-4eab-b9ec-18019b521d0b">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <entry name="serial">ad2d69bb-3aa9-4c11-b9de-29996574cfa2</entry>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <entry name="uuid">ad2d69bb-3aa9-4c11-b9de-29996574cfa2</entry>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.config"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:35:d3:eb"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <target dev="tapcae13af9-81"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/console.log" append="off"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:20:27 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:20:27 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:20:27 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:20:27 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.054 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Preparing to wait for external event network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.054 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.055 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.055 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.056 2 DEBUG nova.virt.libvirt.vif [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629207280',display_name='tempest-ServerActionsTestOtherB-server-1629207280',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629207280',id=108,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+aqSe4de2VLtRAXN5xeLQn4S/3X8QrNMy2M5WdQ5hviVyEOgqK+m+uWmzPaUSUgE38sEdkytfwUHD32CBZajBt4q3OEf9i3yPJUQGuqp42pAUD+A3EoBIyeptNeSxGdA==',key_name='tempest-keypair-1900171990',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-jtzab0yc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=ad2d69bb-3aa9-4c11-b9de-29996574cfa2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.056 2 DEBUG nova.network.os_vif_util [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.056 2 DEBUG nova.network.os_vif_util [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.057 2 DEBUG os_vif [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.058 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.058 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.061 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcae13af9-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.061 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcae13af9-81, col_values=(('external_ids', {'iface-id': 'cae13af9-8175-4eab-b9ec-18019b521d0b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:d3:eb', 'vm-uuid': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 NetworkManager[51205]: <info>  [1759407627.0645] manager: (tapcae13af9-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.071 2 INFO os_vif [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81')#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.131 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.131 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.132 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No VIF found with MAC fa:16:3e:35:d3:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.132 2 INFO nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Using config drive#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.530 2 INFO nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Creating config drive at /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.config#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.536 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwblq9lyh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.669 2 DEBUG oslo_concurrency.processutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwblq9lyh" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:27 np0005466013 kernel: tapcae13af9-81: entered promiscuous mode
Oct  2 08:20:27 np0005466013 NetworkManager[51205]: <info>  [1759407627.7225] manager: (tapcae13af9-81): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Oct  2 08:20:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:27Z|00406|binding|INFO|Claiming lport cae13af9-8175-4eab-b9ec-18019b521d0b for this chassis.
Oct  2 08:20:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:27Z|00407|binding|INFO|cae13af9-8175-4eab-b9ec-18019b521d0b: Claiming fa:16:3e:35:d3:eb 10.100.0.14
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.737 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:d3:eb 10.100.0.14'], port_security=['fa:16:3e:35:d3:eb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '12e9168a-be86-462f-a658-971f38e3430f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cae13af9-8175-4eab-b9ec-18019b521d0b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.738 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cae13af9-8175-4eab-b9ec-18019b521d0b in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 bound to our chassis#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.740 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20eb29be-ee23-463b-85af-bfc2388e9f77#033[00m
Oct  2 08:20:27 np0005466013 systemd-udevd[235428]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.751 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[90a514d2-0f0f-4578-8d77-05fbe95d3fc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.752 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20eb29be-e1 in ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.755 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20eb29be-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.755 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d048aba8-70e6-4e7f-9b0e-1b3478d05da2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.756 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[248ac00c-1587-4fde-9ee0-fcc0580441b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 systemd-machined[152202]: New machine qemu-48-instance-0000006c.
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.767 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[20e8706b-a76e-44ab-af99-7a8fe42284eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 NetworkManager[51205]: <info>  [1759407627.7739] device (tapcae13af9-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:27 np0005466013 NetworkManager[51205]: <info>  [1759407627.7751] device (tapcae13af9-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:27Z|00408|binding|INFO|Setting lport cae13af9-8175-4eab-b9ec-18019b521d0b ovn-installed in OVS
Oct  2 08:20:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:27Z|00409|binding|INFO|Setting lport cae13af9-8175-4eab-b9ec-18019b521d0b up in Southbound
Oct  2 08:20:27 np0005466013 systemd[1]: Started Virtual Machine qemu-48-instance-0000006c.
Oct  2 08:20:27 np0005466013 nova_compute[192144]: 2025-10-02 12:20:27.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.795 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[58e8ed7d-2147-42a3-833e-cbb27865c296]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.824 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fd1a472a-ef19-4e75-850b-87c084f7c820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.828 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[33ee7efb-119b-42e2-9b09-eb05aae4cf05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 NetworkManager[51205]: <info>  [1759407627.8300] manager: (tap20eb29be-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/190)
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.872 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a6fcb9cd-52cd-4876-b489-b3645c0158d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.875 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[17bc6752-c4a7-4f50-92be-22b46b454a86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 NetworkManager[51205]: <info>  [1759407627.8960] device (tap20eb29be-e0): carrier: link connected
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.901 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc07513-6e40-48de-aacd-0a3db70c3801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.922 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8de7029d-8ea1-4e8e-8043-c1ad2be12d81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561951, 'reachable_time': 40813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235461, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.937 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5456a85d-8c2c-4f7c-b1f9-bbc9200823e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:5596'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561951, 'tstamp': 561951}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235462, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.956 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[17c8b1c0-38a5-4bfa-8a15-39b2f3441f34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561951, 'reachable_time': 40813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235463, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:27.989 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bac517-6857-4ad0-8d65-2d4aca3603da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.050 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce101f0-a8eb-4e27-97f2-3bc70c46b3b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.051 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.051 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.052 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20eb29be-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:28 np0005466013 NetworkManager[51205]: <info>  [1759407628.0549] manager: (tap20eb29be-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:28 np0005466013 kernel: tap20eb29be-e0: entered promiscuous mode
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.058 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20eb29be-e0, col_values=(('external_ids', {'iface-id': 'e533861f-45cb-4843-b071-0b628ca25128'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:28Z|00410|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.123 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20eb29be-ee23-463b-85af-bfc2388e9f77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20eb29be-ee23-463b-85af-bfc2388e9f77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.124 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc15af56-e962-46f1-b72e-c4ad6e54f16b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.125 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-20eb29be-ee23-463b-85af-bfc2388e9f77
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/20eb29be-ee23-463b-85af-bfc2388e9f77.pid.haproxy
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 20eb29be-ee23-463b-85af-bfc2388e9f77
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:28.126 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'env', 'PROCESS_TAG=haproxy-20eb29be-ee23-463b-85af-bfc2388e9f77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20eb29be-ee23-463b-85af-bfc2388e9f77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.145 2 DEBUG nova.compute.manager [req-7e3b1f08-5414-4b60-8a5b-b485e1c9c0f2 req-2f5549a3-c2f2-45e7-b693-8d6406b772ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received event network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.146 2 DEBUG oslo_concurrency.lockutils [req-7e3b1f08-5414-4b60-8a5b-b485e1c9c0f2 req-2f5549a3-c2f2-45e7-b693-8d6406b772ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.146 2 DEBUG oslo_concurrency.lockutils [req-7e3b1f08-5414-4b60-8a5b-b485e1c9c0f2 req-2f5549a3-c2f2-45e7-b693-8d6406b772ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.146 2 DEBUG oslo_concurrency.lockutils [req-7e3b1f08-5414-4b60-8a5b-b485e1c9c0f2 req-2f5549a3-c2f2-45e7-b693-8d6406b772ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.147 2 DEBUG nova.compute.manager [req-7e3b1f08-5414-4b60-8a5b-b485e1c9c0f2 req-2f5549a3-c2f2-45e7-b693-8d6406b772ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Processing event network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.325 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.326 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.491 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.492 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407628.490452, ad2d69bb-3aa9-4c11-b9de-29996574cfa2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.493 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.496 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.499 2 INFO nova.virt.libvirt.driver [-] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance spawned successfully.#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.500 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.517 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.524 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.528 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.529 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.529 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.530 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.530 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.531 2 DEBUG nova.virt.libvirt.driver [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:28 np0005466013 podman[235502]: 2025-10-02 12:20:28.555998163 +0000 UTC m=+0.051363392 container create ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.560 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.560 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407628.4908097, ad2d69bb-3aa9-4c11-b9de-29996574cfa2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.561 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:20:28 np0005466013 systemd[1]: Started libpod-conmon-ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534.scope.
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.594 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.601 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407628.4953275, ad2d69bb-3aa9-4c11-b9de-29996574cfa2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.601 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:28 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.617 2 INFO nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Took 5.08 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.618 2 DEBUG nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:28 np0005466013 podman[235502]: 2025-10-02 12:20:28.526788807 +0000 UTC m=+0.022154056 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:28 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bd8b09aaf0e9d525298c7ff89972a46bf226f7585941e7fa69b8b90dfdae1d8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.622 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.631 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:28 np0005466013 podman[235502]: 2025-10-02 12:20:28.636184708 +0000 UTC m=+0.131549957 container init ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:20:28 np0005466013 podman[235502]: 2025-10-02 12:20:28.64262699 +0000 UTC m=+0.137992219 container start ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:20:28 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [NOTICE]   (235521) : New worker (235523) forked
Oct  2 08:20:28 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [NOTICE]   (235521) : Loading success.
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.682 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.707 2 DEBUG nova.network.neutron [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updated VIF entry in instance network info cache for port cae13af9-8175-4eab-b9ec-18019b521d0b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.708 2 DEBUG nova.network.neutron [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.728 2 DEBUG oslo_concurrency.lockutils [req-349b30e0-7889-45f6-84fb-8175a42c3ada req-4eb283ca-8c60-4016-970b-f2c62f98614e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.732 2 INFO nova.compute.manager [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Took 5.64 seconds to build instance.#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.758 2 DEBUG oslo_concurrency.lockutils [None req-75354ac3-823c-4540-8ee9-ca1fc32bb00f 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:28 np0005466013 nova_compute[192144]: 2025-10-02 12:20:28.998 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:29 np0005466013 podman[235532]: 2025-10-02 12:20:29.69661083 +0000 UTC m=+0.062469950 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:29 np0005466013 nova_compute[192144]: 2025-10-02 12:20:29.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:29 np0005466013 nova_compute[192144]: 2025-10-02 12:20:29.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:20:29 np0005466013 nova_compute[192144]: 2025-10-02 12:20:29.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.142 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.143 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.143 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.143 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.277 2 DEBUG nova.compute.manager [req-f6bc123d-1831-4b88-ab5a-4f7c93af2402 req-ba99f708-8c55-4441-9b60-9862f3712ef9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received event network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.278 2 DEBUG oslo_concurrency.lockutils [req-f6bc123d-1831-4b88-ab5a-4f7c93af2402 req-ba99f708-8c55-4441-9b60-9862f3712ef9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.279 2 DEBUG oslo_concurrency.lockutils [req-f6bc123d-1831-4b88-ab5a-4f7c93af2402 req-ba99f708-8c55-4441-9b60-9862f3712ef9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.279 2 DEBUG oslo_concurrency.lockutils [req-f6bc123d-1831-4b88-ab5a-4f7c93af2402 req-ba99f708-8c55-4441-9b60-9862f3712ef9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.279 2 DEBUG nova.compute.manager [req-f6bc123d-1831-4b88-ab5a-4f7c93af2402 req-ba99f708-8c55-4441-9b60-9862f3712ef9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] No waiting events found dispatching network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:30 np0005466013 nova_compute[192144]: 2025-10-02 12:20:30.280 2 WARNING nova.compute.manager [req-f6bc123d-1831-4b88-ab5a-4f7c93af2402 req-ba99f708-8c55-4441-9b60-9862f3712ef9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received unexpected event network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:20:30 np0005466013 podman[235554]: 2025-10-02 12:20:30.687900654 +0000 UTC m=+0.064184885 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:20:30 np0005466013 podman[235555]: 2025-10-02 12:20:30.71488429 +0000 UTC m=+0.084609745 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:20:31 np0005466013 nova_compute[192144]: 2025-10-02 12:20:31.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:31 np0005466013 NetworkManager[51205]: <info>  [1759407631.1649] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Oct  2 08:20:31 np0005466013 NetworkManager[51205]: <info>  [1759407631.1663] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Oct  2 08:20:31 np0005466013 nova_compute[192144]: 2025-10-02 12:20:31.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:31Z|00411|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:20:31 np0005466013 nova_compute[192144]: 2025-10-02 12:20:31.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:31 np0005466013 nova_compute[192144]: 2025-10-02 12:20:31.497 2 DEBUG nova.compute.manager [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received event network-changed-cae13af9-8175-4eab-b9ec-18019b521d0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:31 np0005466013 nova_compute[192144]: 2025-10-02 12:20:31.497 2 DEBUG nova.compute.manager [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Refreshing instance network info cache due to event network-changed-cae13af9-8175-4eab-b9ec-18019b521d0b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:31 np0005466013 nova_compute[192144]: 2025-10-02 12:20:31.498 2 DEBUG oslo_concurrency.lockutils [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:31 np0005466013 nova_compute[192144]: 2025-10-02 12:20:31.960 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:32 np0005466013 nova_compute[192144]: 2025-10-02 12:20:32.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:32 np0005466013 nova_compute[192144]: 2025-10-02 12:20:32.024 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:20:32 np0005466013 nova_compute[192144]: 2025-10-02 12:20:32.024 2 DEBUG oslo_concurrency.lockutils [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:32 np0005466013 nova_compute[192144]: 2025-10-02 12:20:32.024 2 DEBUG nova.network.neutron [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Refreshing network info cache for port cae13af9-8175-4eab-b9ec-18019b521d0b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:32 np0005466013 nova_compute[192144]: 2025-10-02 12:20:32.026 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:32 np0005466013 nova_compute[192144]: 2025-10-02 12:20:32.027 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:32 np0005466013 nova_compute[192144]: 2025-10-02 12:20:32.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:33 np0005466013 nova_compute[192144]: 2025-10-02 12:20:33.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:34 np0005466013 nova_compute[192144]: 2025-10-02 12:20:34.022 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:34 np0005466013 nova_compute[192144]: 2025-10-02 12:20:34.023 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:35 np0005466013 nova_compute[192144]: 2025-10-02 12:20:35.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:35 np0005466013 podman[235595]: 2025-10-02 12:20:35.704757455 +0000 UTC m=+0.065140175 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:35 np0005466013 podman[235594]: 2025-10-02 12:20:35.726644922 +0000 UTC m=+0.090088397 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:20:35 np0005466013 nova_compute[192144]: 2025-10-02 12:20:35.731 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407620.7303143, 21558304-9e39-4f50-9137-af0282f5cfca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:35 np0005466013 nova_compute[192144]: 2025-10-02 12:20:35.731 2 INFO nova.compute.manager [-] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:35 np0005466013 nova_compute[192144]: 2025-10-02 12:20:35.759 2 DEBUG nova.compute.manager [None req-a90affd7-cc7c-41c9-801d-413311a39251 - - - - - -] [instance: 21558304-9e39-4f50-9137-af0282f5cfca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:35 np0005466013 nova_compute[192144]: 2025-10-02 12:20:35.894 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407620.893926, d88505e6-83d4-4006-a5d8-33e9ab64a380 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:35 np0005466013 nova_compute[192144]: 2025-10-02 12:20:35.895 2 INFO nova.compute.manager [-] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:35 np0005466013 nova_compute[192144]: 2025-10-02 12:20:35.919 2 DEBUG nova.compute.manager [None req-2561c57d-b857-4316-abf7-04a6343b3957 - - - - - -] [instance: d88505e6-83d4-4006-a5d8-33e9ab64a380] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:36 np0005466013 nova_compute[192144]: 2025-10-02 12:20:36.370 2 DEBUG nova.network.neutron [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updated VIF entry in instance network info cache for port cae13af9-8175-4eab-b9ec-18019b521d0b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:36 np0005466013 nova_compute[192144]: 2025-10-02 12:20:36.371 2 DEBUG nova.network.neutron [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:36 np0005466013 nova_compute[192144]: 2025-10-02 12:20:36.395 2 DEBUG oslo_concurrency.lockutils [req-dc647fa5-9d86-4dd7-8086-27d13dc055c9 req-7fe2e0c3-90f7-4eac-b4f1-6b8b1af56829 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:37 np0005466013 nova_compute[192144]: 2025-10-02 12:20:37.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:38 np0005466013 nova_compute[192144]: 2025-10-02 12:20:38.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:40 np0005466013 nova_compute[192144]: 2025-10-02 12:20:40.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:42 np0005466013 nova_compute[192144]: 2025-10-02 12:20:42.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:42 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:42Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:d3:eb 10.100.0.14
Oct  2 08:20:42 np0005466013 ovn_controller[94366]: 2025-10-02T12:20:42Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:d3:eb 10.100.0.14
Oct  2 08:20:43 np0005466013 nova_compute[192144]: 2025-10-02 12:20:43.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:47 np0005466013 nova_compute[192144]: 2025-10-02 12:20:47.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:48 np0005466013 nova_compute[192144]: 2025-10-02 12:20:48.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:50 np0005466013 podman[235652]: 2025-10-02 12:20:50.679926695 +0000 UTC m=+0.055132621 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:20:50 np0005466013 podman[235676]: 2025-10-02 12:20:50.758747867 +0000 UTC m=+0.056199845 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:20:50 np0005466013 podman[235677]: 2025-10-02 12:20:50.816211999 +0000 UTC m=+0.110225449 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:20:52 np0005466013 nova_compute[192144]: 2025-10-02 12:20:52.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:53.468 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:53 np0005466013 nova_compute[192144]: 2025-10-02 12:20:53.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:53 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:53.470 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:20:53 np0005466013 nova_compute[192144]: 2025-10-02 12:20:53.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:20:56.472 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:57 np0005466013 nova_compute[192144]: 2025-10-02 12:20:57.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:58 np0005466013 nova_compute[192144]: 2025-10-02 12:20:58.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:00Z|00412|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:21:00 np0005466013 nova_compute[192144]: 2025-10-02 12:21:00.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:00 np0005466013 podman[235716]: 2025-10-02 12:21:00.69468616 +0000 UTC m=+0.065619909 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:21:00 np0005466013 podman[235736]: 2025-10-02 12:21:00.779340746 +0000 UTC m=+0.059712694 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd)
Oct  2 08:21:00 np0005466013 podman[235756]: 2025-10-02 12:21:00.862792853 +0000 UTC m=+0.056073409 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal)
Oct  2 08:21:02 np0005466013 nova_compute[192144]: 2025-10-02 12:21:02.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:02.303 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:02.304 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:02.306 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:03 np0005466013 nova_compute[192144]: 2025-10-02 12:21:03.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.041 2 DEBUG nova.compute.manager [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.103 2 INFO nova.compute.manager [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] instance snapshotting#033[00m
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.103 2 DEBUG nova.objects.instance [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'flavor' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.517 2 INFO nova.virt.libvirt.driver [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Beginning live snapshot process#033[00m
Oct  2 08:21:06 np0005466013 podman[235778]: 2025-10-02 12:21:06.676090846 +0000 UTC m=+0.053964234 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:21:06 np0005466013 podman[235777]: 2025-10-02 12:21:06.680596698 +0000 UTC m=+0.058384812 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:21:06 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.735 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.815 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.817 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.876 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.889 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.944 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:06 np0005466013 nova_compute[192144]: 2025-10-02 12:21:06.945 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpr0jt74kq/20e0d293aeee480c954c24952668fcc6.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:07 np0005466013 nova_compute[192144]: 2025-10-02 12:21:07.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:07 np0005466013 nova_compute[192144]: 2025-10-02 12:21:07.197 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpr0jt74kq/20e0d293aeee480c954c24952668fcc6.delta 1073741824" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:07 np0005466013 nova_compute[192144]: 2025-10-02 12:21:07.197 2 INFO nova.virt.libvirt.driver [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Quiescing instance not available: QEMU guest agent is not enabled.
Oct  2 08:21:07 np0005466013 nova_compute[192144]: 2025-10-02 12:21:07.253 2 DEBUG nova.virt.libvirt.guest [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Oct  2 08:21:07 np0005466013 nova_compute[192144]: 2025-10-02 12:21:07.758 2 DEBUG nova.virt.libvirt.guest [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 3735552 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Oct  2 08:21:08 np0005466013 nova_compute[192144]: 2025-10-02 12:21:08.261 2 DEBUG nova.virt.libvirt.guest [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 9502720 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Oct  2 08:21:08 np0005466013 nova_compute[192144]: 2025-10-02 12:21:08.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:08 np0005466013 nova_compute[192144]: 2025-10-02 12:21:08.766 2 DEBUG nova.virt.libvirt.guest [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Oct  2 08:21:08 np0005466013 nova_compute[192144]: 2025-10-02 12:21:08.771 2 INFO nova.virt.libvirt.driver [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Skipping quiescing instance: QEMU guest agent is not enabled.
Oct  2 08:21:08 np0005466013 nova_compute[192144]: 2025-10-02 12:21:08.822 2 DEBUG nova.privsep.utils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct  2 08:21:08 np0005466013 nova_compute[192144]: 2025-10-02 12:21:08.824 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpr0jt74kq/20e0d293aeee480c954c24952668fcc6.delta /var/lib/nova/instances/snapshots/tmpr0jt74kq/20e0d293aeee480c954c24952668fcc6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:09 np0005466013 nova_compute[192144]: 2025-10-02 12:21:09.294 2 DEBUG oslo_concurrency.processutils [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpr0jt74kq/20e0d293aeee480c954c24952668fcc6.delta /var/lib/nova/instances/snapshots/tmpr0jt74kq/20e0d293aeee480c954c24952668fcc6" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:09 np0005466013 nova_compute[192144]: 2025-10-02 12:21:09.300 2 INFO nova.virt.libvirt.driver [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Snapshot extracted, beginning image upload
Oct  2 08:21:10 np0005466013 nova_compute[192144]: 2025-10-02 12:21:10.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:12 np0005466013 nova_compute[192144]: 2025-10-02 12:21:12.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:12 np0005466013 nova_compute[192144]: 2025-10-02 12:21:12.870 2 INFO nova.virt.libvirt.driver [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Snapshot image upload complete
Oct  2 08:21:12 np0005466013 nova_compute[192144]: 2025-10-02 12:21:12.871 2 INFO nova.compute.manager [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Took 6.73 seconds to snapshot the instance on the hypervisor.
Oct  2 08:21:13 np0005466013 nova_compute[192144]: 2025-10-02 12:21:13.287 2 DEBUG nova.compute.manager [None req-120eb160-3412-4e62-9676-4b000c93cf7a 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Oct  2 08:21:13 np0005466013 nova_compute[192144]: 2025-10-02 12:21:13.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:14 np0005466013 nova_compute[192144]: 2025-10-02 12:21:14.959 2 DEBUG nova.compute.manager [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.041 2 INFO nova.compute.manager [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] instance snapshotting
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.042 2 DEBUG nova.objects.instance [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'flavor' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.441 2 INFO nova.virt.libvirt.driver [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Beginning live snapshot process
Oct  2 08:21:15 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.792 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.886 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.888 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.942 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:15 np0005466013 nova_compute[192144]: 2025-10-02 12:21:15.955 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.021 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.023 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp9k_icnv4/34db5714671b4959a04c4cabc1388a2b.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.066 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp9k_icnv4/34db5714671b4959a04c4cabc1388a2b.delta 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.068 2 INFO nova.virt.libvirt.driver [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Quiescing instance not available: QEMU guest agent is not enabled.
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.122 2 DEBUG nova.virt.libvirt.guest [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.351 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'hostId': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.374 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.write.requests volume: 338 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.375 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c433aa2-b206-4f80-915c-7e7bc72e0dfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 338, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.352523', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b28ba92-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '2a3229e8867bd8954d8d13fc31a6254df63bedf01637b2860b55eb461ab892c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.352523', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b28c992-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': 'c70bcd056c35b27bfd309d3f787a5b4085be116c18303c232282510f056031a7'}]}, 'timestamp': '2025-10-02 12:21:16.375440', '_unique_id': '9b1185b602904024ad0eab4300ec9ad2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.376 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.377 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.381 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ad2d69bb-3aa9-4c11-b9de-29996574cfa2 / tapcae13af9-81 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.381 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa5e5068-9ba4-4a1a-819d-cd48a86fae8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.377859', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b29bb04-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': '652bbf46a94cdec30c08b2cbb81266ee26bbdc5f0191995a17a563449fb9b6d8'}]}, 'timestamp': '2025-10-02 12:21:16.381726', '_unique_id': 'ea64b3adcaec4369a5d745051851cb5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.382 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.383 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.383 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.read.requests volume: 1123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.383 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5253c088-62a4-49b1-b80e-0f98707c5b82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1123, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.383453', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b2a14be-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '48eb22ce59330b3bfe5dbc00a56f53c592699337a5c509b56049e1d44a00a19b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.383453', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b2a2242-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '63736df98b6fdde8081d5a0d88f5e7bc2d54bc86b839867d9a5be63fed116eac'}]}, 'timestamp': '2025-10-02 12:21:16.384268', '_unique_id': 'a981b949140c4ee0bbccd5114c500b3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.384 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.385 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.398 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.399 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b91b91a2-ac8e-4889-949b-35afc9f89236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.385984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b2c5fd0-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.064207296, 'message_signature': '27c47b18bad2de0ff1b8023060f2c257b88f0255b10c89e715ea05cedc3f2d66'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.385984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b2c6f66-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.064207296, 'message_signature': 'a41e46dd1c76fbf5b1d79e801f335144b917a1392e7bfa499657aa86045f86b4'}]}, 'timestamp': '2025-10-02 12:21:16.399343', '_unique_id': '627a6f397aa04dd08525d42538f70742'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.400 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.401 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.401 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '728f783b-d36b-4122-8076-959050232079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.401860', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b2cdda2-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': 'dc6783c9e7cb4d6bc64501f5f775bed304f490128ca67213577e700eaa837e7c'}]}, 'timestamp': '2025-10-02 12:21:16.402124', '_unique_id': '96b8b99a579e4d878c547c097f4795d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.402 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.403 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.403 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.403 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>]
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.403 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.403 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b1b967e-aac6-4f86-8cf6-80d51c3fc776', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.403795', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b2d28de-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': 'ff6cd496090a2ab292757e4d93736ee15ceff8cc0d3f98029156690f2cc3076b'}]}, 'timestamp': '2025-10-02 12:21:16.404052', '_unique_id': 'af47b623cf28486e950ed4ba34e656bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ece461a2-a961-4d45-b03e-db567f501e7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.405174', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b2d5eda-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': 'c17675cfc9cf63f82ae8ad58a3e08485593e859da022a432608123eacb57aae2'}]}, 'timestamp': '2025-10-02 12:21:16.405431', '_unique_id': '0cc5fc0fb2be4d258ba1684c841e164f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.405 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.406 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.write.latency volume: 78437643984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.406 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bda8b3b-c03f-4b37-a7c2-69168bad6bc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 78437643984, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.406623', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b2d96de-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '41786bce8bda00da51cb733d4cf351166ab377d1dc498cbff02ca6805294b346'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.406623', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b2da0f2-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': 'b59775ec19d8119a7b4ae37bd0adf379a794d5381b94b145f2f160d6ace9fee4'}]}, 'timestamp': '2025-10-02 12:21:16.407132', '_unique_id': 'b821f25cae954d59b88feb25b8ae54dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.407 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.408 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.408 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.read.bytes volume: 30751232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.408 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a4a5833-b152-496f-928b-434076110e26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30751232, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.408320', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b2dd914-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '8e21c2cd0c87c4583651a88d62e1bc9d140d079d65446cf72bcd806da946b649'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.408320', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b2de0c6-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '26334fc41f3e80cddeb3c823bcd4a1f8ffebfd2b833181980161c38f4c73c450'}]}, 'timestamp': '2025-10-02 12:21:16.408740', '_unique_id': '3cdbeef7dd744fe6820f1bf8e771688a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.409 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '242c134f-d2c3-4511-b08d-01b92294c3e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.409867', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b2e1546-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': '9785068a16c319e1a26502d84b3762d8710457eb4446414b1a45fe13c48eea44'}]}, 'timestamp': '2025-10-02 12:21:16.410097', '_unique_id': '90c5be10c4124941b5feb1d03a7684f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.410 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.411 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.411 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>]
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.411 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.411 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.read.latency volume: 693972963 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.411 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.read.latency volume: 55263204 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cbaccc1-bddc-4c85-8873-debce827ffc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 693972963, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.411466', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b2e533a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': 'e58ae6b61ae292752e8fd2bd0937cd0dfbcb03f4a0263a2f2872b4d7381807a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55263204, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.411466', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b2e5b00-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '1106485a25e5a6603e27c8ce89f6532a906531a0d917afde83c0a484d77ff268'}]}, 'timestamp': '2025-10-02 12:21:16.411889', '_unique_id': '83e5c6e0115f49c289c84dafada575f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.412 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>]
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44c696fc-d66a-44e8-aeb7-8f8dfd00a4be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.413220', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b2e97e6-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': '62f2989d407478cdc1c2acddbdc05b7ed74cfd3a93156547c74068bb9491130f'}]}, 'timestamp': '2025-10-02 12:21:16.413440', '_unique_id': 'a9003e9d73ae4ca88783a7b8d731b7ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.413 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.414 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.438 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/cpu volume: 11930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3eb3174-1b91-4a00-9665-3e6b1ad9414d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11930000000, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'timestamp': '2025-10-02T12:21:16.414512', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4b326e66-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.116161075, 'message_signature': '1d2ad88ce81bb4fb26da598415b3a835eea430986af4601ef949f3677813bff2'}]}, 'timestamp': '2025-10-02 12:21:16.438665', '_unique_id': 'fafd9775b57f4ab0aadde105d9197dc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.440 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.440 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.440 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d58d8ff-7502-4060-891d-3d7cff707c0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.440383', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b32bd80-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.064207296, 'message_signature': '40ce6712da63d0c19713ee9b7ef08348e3273d953bbf01ef906c93fcc0069b5d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 
'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.440383', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b32c5c8-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.064207296, 'message_signature': '660f45974890210f914868f60af80ac8032cbf54335656b818ac632cfc808ccb'}]}, 'timestamp': '2025-10-02 12:21:16.440818', '_unique_id': '176141b44bfc49cb84b57381f6934b24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.441 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebae9ec3-e46d-4e56-8abf-223472207285', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.442019', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b32fd2c-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.064207296, 'message_signature': 'dff86bda649b9b2a77ef30a56033fa6b5806ed81b6fedc32286db4f484431d7d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 
'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.442019', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b330542-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.064207296, 'message_signature': '20f1bb630eacf26cae4aba4dbb0881bb4e1d90b72f95ebec03d6ddd20df8b933'}]}, 'timestamp': '2025-10-02 12:21:16.442450', '_unique_id': '2a63e4f28b7a48f380151c5bf4a18e28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.443 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.443 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/memory.usage volume: 42.88671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a29f0633-c5d4-4679-8771-4733376e8883', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.88671875, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'timestamp': '2025-10-02T12:21:16.443641', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4b333c56-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.116161075, 'message_signature': 'a90e90a6e1a9d74c18b94f1fe418b4adc8567fc9677a6f0f2b1009bf969a0565'}]}, 'timestamp': '2025-10-02 12:21:16.443887', '_unique_id': 'f1cf58329d924506bd8faa7de11fde71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.444 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dd7f4a5-b215-4fbb-a748-c130bc954bae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.445051', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b3373ce-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': 'ae32066524ca6dfd300d9b61ea41e07fa5b8e670d8a539837363f5c09a665df3'}]}, 'timestamp': '2025-10-02 12:21:16.445343', '_unique_id': 'e623812526a54baab0c9d640207a8ff0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.445 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.446 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.446 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.incoming.bytes volume: 4569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12398d47-2bc2-4525-9d70-da49e145a4bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4569, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.446488', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b33ab5a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': 'f461e35da98f68bd232cf1f65e7fcef5bd3099ddacf635f814067bad1e46e613'}]}, 'timestamp': '2025-10-02 12:21:16.446707', '_unique_id': '9245df9ef49b4fe3badce0f0789b545b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.447 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.write.bytes volume: 72994816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '944267db-146e-476d-89c2-d827fb09467f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72994816, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-vda', 'timestamp': '2025-10-02T12:21:16.447808', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b33e048-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': 'aafcf32b4a5d65d594778c452490a8925330165e0c4e9bb5fcbd8479862f367f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2-sda', 'timestamp': '2025-10-02T12:21:16.447808', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'instance-0000006c', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b33e980-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.030742405, 'message_signature': '673f1d7745dd54d7cc9f28746eaa5a91eadb90aaa29a0620171789a0e10f5266'}]}, 'timestamp': '2025-10-02 12:21:16.448312', '_unique_id': 'ad6d49cc59e24fbcb134ca0d95f4a269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.449 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e00b35dd-c70f-44b9-abfa-c382557cee43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.449487', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b3420e4-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': '8d392fa5f7253c40be61a663d9375774e7a379922e866d9174e88d688aef85f3'}]}, 'timestamp': '2025-10-02 12:21:16.449715', '_unique_id': '85e95c369f7943bd9de2b2b27b092653'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.450 12 DEBUG ceilometer.compute.pollsters [-] ad2d69bb-3aa9-4c11-b9de-29996574cfa2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95c0df4b-9409-4ea5-8e28-feceee824641', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-0000006c-ad2d69bb-3aa9-4c11-b9de-29996574cfa2-tapcae13af9-81', 'timestamp': '2025-10-02T12:21:16.450756', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629207280', 'name': 'tapcae13af9-81', 'instance_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:35:d3:eb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae13af9-81'}, 'message_id': '4b3452c6-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5668.056092791, 'message_signature': '59998ef1d18c96a4aacaf1e232f62586e63283d5c3deb1d76400d387b63dfceb'}]}, 'timestamp': '2025-10-02 12:21:16.450994', '_unique_id': '51ec542e16494e219b4710308fcbd8ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.452 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:21:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:21:16.452 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629207280>]
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.626 2 DEBUG nova.virt.libvirt.guest [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.631 2 INFO nova.virt.libvirt.driver [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Skipping quiescing instance: QEMU guest agent is not enabled.
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.672 2 DEBUG nova.privsep.utils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct  2 08:21:16 np0005466013 nova_compute[192144]: 2025-10-02 12:21:16.672 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp9k_icnv4/34db5714671b4959a04c4cabc1388a2b.delta /var/lib/nova/instances/snapshots/tmp9k_icnv4/34db5714671b4959a04c4cabc1388a2b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:17 np0005466013 nova_compute[192144]: 2025-10-02 12:21:17.068 2 DEBUG oslo_concurrency.processutils [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp9k_icnv4/34db5714671b4959a04c4cabc1388a2b.delta /var/lib/nova/instances/snapshots/tmp9k_icnv4/34db5714671b4959a04c4cabc1388a2b" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:17 np0005466013 nova_compute[192144]: 2025-10-02 12:21:17.076 2 INFO nova.virt.libvirt.driver [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Snapshot extracted, beginning image upload
Oct  2 08:21:17 np0005466013 nova_compute[192144]: 2025-10-02 12:21:17.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:17 np0005466013 nova_compute[192144]: 2025-10-02 12:21:17.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:18 np0005466013 nova_compute[192144]: 2025-10-02 12:21:18.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:19 np0005466013 nova_compute[192144]: 2025-10-02 12:21:19.905 2 INFO nova.virt.libvirt.driver [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Snapshot image upload complete
Oct  2 08:21:19 np0005466013 nova_compute[192144]: 2025-10-02 12:21:19.906 2 INFO nova.compute.manager [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Took 4.82 seconds to snapshot the instance on the hypervisor.
Oct  2 08:21:20 np0005466013 nova_compute[192144]: 2025-10-02 12:21:20.266 2 DEBUG nova.compute.manager [None req-a9898b34-74bc-4b6f-b208-ef660dfaed18 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Oct  2 08:21:21 np0005466013 podman[235890]: 2025-10-02 12:21:21.728252853 +0000 UTC m=+0.092962297 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:21 np0005466013 podman[235889]: 2025-10-02 12:21:21.751021028 +0000 UTC m=+0.113852563 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:21:21 np0005466013 podman[235891]: 2025-10-02 12:21:21.77533533 +0000 UTC m=+0.130188644 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.005 2 DEBUG nova.compute.manager [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.088 2 INFO nova.compute.manager [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] instance snapshotting
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.089 2 DEBUG nova.objects.instance [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'flavor' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.488 2 INFO nova.virt.libvirt.driver [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Beginning live snapshot process
Oct  2 08:21:22 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.760 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.829 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.830 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.897 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json -f qcow2" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.910 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.970 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:22 np0005466013 nova_compute[192144]: 2025-10-02 12:21:22.972 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpw2o074ks/6257e2b515f34288a2c5ae522f3a3b1f.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.015 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpw2o074ks/6257e2b515f34288a2c5ae522f3a3b1f.delta 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.017 2 INFO nova.virt.libvirt.driver [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Quiescing instance not available: QEMU guest agent is not enabled.
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.078 2 DEBUG nova.virt.libvirt.guest [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.583 2 DEBUG nova.virt.libvirt.guest [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.589 2 INFO nova.virt.libvirt.driver [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.634 2 DEBUG nova.privsep.utils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.635 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpw2o074ks/6257e2b515f34288a2c5ae522f3a3b1f.delta /var/lib/nova/instances/snapshots/tmpw2o074ks/6257e2b515f34288a2c5ae522f3a3b1f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:23 np0005466013 nova_compute[192144]: 2025-10-02 12:21:23.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:24 np0005466013 nova_compute[192144]: 2025-10-02 12:21:24.100 2 DEBUG oslo_concurrency.processutils [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpw2o074ks/6257e2b515f34288a2c5ae522f3a3b1f.delta /var/lib/nova/instances/snapshots/tmpw2o074ks/6257e2b515f34288a2c5ae522f3a3b1f" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:24 np0005466013 nova_compute[192144]: 2025-10-02 12:21:24.111 2 INFO nova.virt.libvirt.driver [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:21:25 np0005466013 nova_compute[192144]: 2025-10-02 12:21:25.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:25 np0005466013 nova_compute[192144]: 2025-10-02 12:21:25.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:21:26 np0005466013 nova_compute[192144]: 2025-10-02 12:21:26.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.030 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.031 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.031 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.031 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.058 2 INFO nova.virt.libvirt.driver [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Snapshot image upload complete#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.059 2 INFO nova.compute.manager [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Took 4.93 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.243 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.329 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.331 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.395 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.595 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.596 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5557MB free_disk=73.27542877197266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.596 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.597 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.648 2 DEBUG nova.compute.manager [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.648 2 DEBUG nova.compute.manager [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.649 2 DEBUG nova.compute.manager [None req-5b110a8c-92e9-4331-97fa-6d497055bc74 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Deleting image 7e9655f3-9387-477d-9455-cacc68bfdba2 _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.736 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance ad2d69bb-3aa9-4c11-b9de-29996574cfa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.737 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.737 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.790 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.803 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.835 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:21:27 np0005466013 nova_compute[192144]: 2025-10-02 12:21:27.835 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:28 np0005466013 nova_compute[192144]: 2025-10-02 12:21:28.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:28 np0005466013 nova_compute[192144]: 2025-10-02 12:21:28.841 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:28 np0005466013 nova_compute[192144]: 2025-10-02 12:21:28.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:30 np0005466013 nova_compute[192144]: 2025-10-02 12:21:30.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:30 np0005466013 nova_compute[192144]: 2025-10-02 12:21:30.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:30 np0005466013 nova_compute[192144]: 2025-10-02 12:21:30.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:31 np0005466013 podman[235990]: 2025-10-02 12:21:31.691465236 +0000 UTC m=+0.054611029 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.6)
Oct  2 08:21:31 np0005466013 podman[235989]: 2025-10-02 12:21:31.704701117 +0000 UTC m=+0.071579866 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:21:31 np0005466013 podman[235991]: 2025-10-02 12:21:31.724703549 +0000 UTC m=+0.076751908 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:21:31 np0005466013 nova_compute[192144]: 2025-10-02 12:21:31.991 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:31 np0005466013 nova_compute[192144]: 2025-10-02 12:21:31.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:31 np0005466013 nova_compute[192144]: 2025-10-02 12:21:31.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:21:31 np0005466013 nova_compute[192144]: 2025-10-02 12:21:31.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:21:32 np0005466013 nova_compute[192144]: 2025-10-02 12:21:32.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:32 np0005466013 nova_compute[192144]: 2025-10-02 12:21:32.212 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:32 np0005466013 nova_compute[192144]: 2025-10-02 12:21:32.212 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:32 np0005466013 nova_compute[192144]: 2025-10-02 12:21:32.212 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:21:32 np0005466013 nova_compute[192144]: 2025-10-02 12:21:32.212 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:33 np0005466013 nova_compute[192144]: 2025-10-02 12:21:33.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.742 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.743 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.743 2 INFO nova.compute.manager [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Unshelving#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.870 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.871 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.875 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.893 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'numa_topology' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.905 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:21:34 np0005466013 nova_compute[192144]: 2025-10-02 12:21:34.905 2 INFO nova.compute.claims [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:21:35 np0005466013 nova_compute[192144]: 2025-10-02 12:21:35.052 2 DEBUG nova.compute.provider_tree [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:35 np0005466013 nova_compute[192144]: 2025-10-02 12:21:35.068 2 DEBUG nova.scheduler.client.report [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
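The inventory dict logged above determines how much the scheduler may place on this node: schedulable capacity per resource class is `(total - reserved) * allocation_ratio`. A small sketch using the exact figures from the log:

```python
# Inventory values copied from the report-client log line above
# (min_unit/max_unit/step_size omitted; they don't affect totals).
inventory = {
    'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
}

def capacity(inv):
    """Schedulable capacity per resource class: (total - reserved) * ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

caps = capacity(inventory)
# VCPU: (8 - 0) * 4.0 = 32.0 schedulable vCPUs on an 8-core host.
```

The 4.0 CPU overcommit and the 0.9 disk ratio explain why the claim at 12:21:34 succeeds even on a small node.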
Oct  2 08:21:35 np0005466013 nova_compute[192144]: 2025-10-02 12:21:35.122 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:35 np0005466013 nova_compute[192144]: 2025-10-02 12:21:35.288 2 INFO nova.network.neutron [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Updating port 0c328734-ebc6-47bc-b603-2e4af1cae573 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:21:36 np0005466013 nova_compute[192144]: 2025-10-02 12:21:36.355 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:36 np0005466013 nova_compute[192144]: 2025-10-02 12:21:36.379 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:36 np0005466013 nova_compute[192144]: 2025-10-02 12:21:36.380 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:21:37 np0005466013 nova_compute[192144]: 2025-10-02 12:21:37.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:37 np0005466013 nova_compute[192144]: 2025-10-02 12:21:37.566 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:37 np0005466013 nova_compute[192144]: 2025-10-02 12:21:37.566 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquired lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:37 np0005466013 nova_compute[192144]: 2025-10-02 12:21:37.566 2 DEBUG nova.network.neutron [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:21:37 np0005466013 podman[236043]: 2025-10-02 12:21:37.683927338 +0000 UTC m=+0.055677022 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:21:37 np0005466013 podman[236044]: 2025-10-02 12:21:37.695767456 +0000 UTC m=+0.063343731 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:21:38 np0005466013 nova_compute[192144]: 2025-10-02 12:21:38.036 2 DEBUG nova.compute.manager [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-changed-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:38 np0005466013 nova_compute[192144]: 2025-10-02 12:21:38.036 2 DEBUG nova.compute.manager [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Refreshing instance network info cache due to event network-changed-0c328734-ebc6-47bc-b603-2e4af1cae573. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:21:38 np0005466013 nova_compute[192144]: 2025-10-02 12:21:38.037 2 DEBUG oslo_concurrency.lockutils [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:38 np0005466013 nova_compute[192144]: 2025-10-02 12:21:38.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.303 2 DEBUG nova.network.neutron [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Updating instance_info_cache with network_info: [{"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
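The network_info blob Nova caches above is plain JSON; the fields consumed downstream (device name, fixed IPs, active flag) can be pulled out directly. A sketch against a trimmed copy of the logged structure:

```python
import json

# Trimmed to the fields used here; values taken from the cache-update
# log line above for port 0c328734-ebc6-47bc-b603-2e4af1cae573.
network_info = json.loads('''[{
  "id": "0c328734-ebc6-47bc-b603-2e4af1cae573",
  "devname": "tap0c328734-eb",
  "network": {"subnets": [{"cidr": "10.100.0.0/28",
    "ips": [{"address": "10.100.0.10", "type": "fixed"}]}]},
  "active": false
}]''')

vif = network_info[0]
fixed_ips = [ip['address']
             for subnet in vif['network']['subnets']
             for ip in subnet['ips'] if ip['type'] == 'fixed']
```

Note `"active": false` at this point: the port is bound to the host but OVN has not yet reported the VIF as plugged, which is expected mid-unshelve.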
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.328 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Releasing lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.330 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.330 2 INFO nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Creating image(s)#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.331 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "/var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.331 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "/var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.332 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "/var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.332 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.333 2 DEBUG oslo_concurrency.lockutils [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.333 2 DEBUG nova.network.neutron [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Refreshing network info cache for port 0c328734-ebc6-47bc-b603-2e4af1cae573 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.351 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "c5d7c7df9c32610775fd016cc9585c255e2a2d15" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:39 np0005466013 nova_compute[192144]: 2025-10-02 12:21:39.353 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "c5d7c7df9c32610775fd016cc9585c255e2a2d15" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.070 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.134 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.135 2 DEBUG nova.virt.images [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] e54e42ff-f245-4c1b-a659-20ba701a4194 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
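The "was qcow2, converting to raw" step runs the qemu-img command logged two lines below. A sketch building the same argv as a list (paths copied from the log; executing it obviously requires qemu-img on the host):

```python
# Base-image cache path from the log lines above.
base = '/var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15'

convert_cmd = [
    'qemu-img', 'convert',
    '-t', 'none',     # cache mode "none": bypass the host page cache
    '-O', 'raw',      # output format
    '-f', 'qcow2',    # input format, as detected by qemu-img info
    f'{base}.part',       # downloaded image, still qcow2
    f'{base}.converted',  # raw result, later renamed into place
]
# On a real host: subprocess.run(convert_cmd, check=True)
```

The `.part` → `.converted` → final-name dance keeps a half-written base image from ever being visible under its cache key, which matters because the key (the SHA of the image ID) is shared by every instance on the node.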
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.137 2 DEBUG nova.privsep.utils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.137 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.part /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.301 2 DEBUG nova.network.neutron [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Updated VIF entry in instance network info cache for port 0c328734-ebc6-47bc-b603-2e4af1cae573. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.303 2 DEBUG nova.network.neutron [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Updating instance_info_cache with network_info: [{"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.332 2 DEBUG oslo_concurrency.lockutils [req-f2a4c6d6-b7b5-4732-9cd1-36db411bdcf6 req-aae0e7a5-0f05-40ff-808d-a98690d71ade 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.576 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.part /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.converted" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.585 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.650 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15.converted --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.651 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "c5d7c7df9c32610775fd016cc9585c255e2a2d15" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.665 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.724 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.725 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "c5d7c7df9c32610775fd016cc9585c255e2a2d15" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.726 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "c5d7c7df9c32610775fd016cc9585c255e2a2d15" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.737 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.794 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.796 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15,backing_fmt=raw /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.851 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15,backing_fmt=raw /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
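The per-instance disk created above is not a copy of the base image but a qcow2 overlay backed by it, so only blocks the guest writes consume space. A sketch of the same `qemu-img create` invocation:

```python
# Paths and size copied from the qemu-img create log line above.
base = '/var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15'
disk = '/var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk'

create_cmd = [
    'qemu-img', 'create', '-f', 'qcow2',
    # Overlay options: where unwritten blocks are read from, and its format.
    '-o', f'backing_file={base},backing_fmt=raw',
    disk,
    str(1 * 1024 ** 3),   # 1073741824 bytes = 1 GiB virtual disk size
]
```

Since qemu 6.1, `backing_fmt` is effectively mandatory when `backing_file` is given, which is why Nova passes it explicitly here.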
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.853 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "c5d7c7df9c32610775fd016cc9585c255e2a2d15" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.853 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.928 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.930 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'migration_context' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.951 2 INFO nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Rebasing disk image.#033[00m
Oct  2 08:21:41 np0005466013 nova_compute[192144]: 2025-10-02 12:21:41.951 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.015 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.017 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.434 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.435 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.454 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.551 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.552 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.561 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.561 2 INFO nova.compute.claims [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.707 2 DEBUG nova.compute.provider_tree [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.721 2 DEBUG nova.scheduler.client.report [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.742 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.742 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.794 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.795 2 DEBUG nova.network.neutron [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.819 2 INFO nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.843 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.956 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.957 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.957 2 INFO nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Creating image(s)#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.958 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "/var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.958 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.959 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:42 np0005466013 nova_compute[192144]: 2025-10-02 12:21:42.974 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.041 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.042 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.043 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.059 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.119 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.120 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.229 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk 1073741824" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.231 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.232 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.294 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.296 2 DEBUG nova.virt.disk.api [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Checking if we can resize image /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.296 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.318 2 DEBUG nova.policy [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.365 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.366 2 DEBUG nova.virt.disk.api [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Cannot resize image /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.366 2 DEBUG nova.objects.instance [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'migration_context' on Instance uuid f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.384 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.385 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Ensure instance console log exists: /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.385 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.385 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.385 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.607 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk" returned: 0 in 1.590s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.608 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.608 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Ensure instance console log exists: /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.609 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.609 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.609 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.611 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Start _get_guest_xml network_info=[{"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='4ba12734f950c2c692266126a9babbc1',container_format='bare',created_at=2025-10-02T12:21:16Z,direct_url=<?>,disk_format='qcow2',id=e54e42ff-f245-4c1b-a659-20ba701a4194,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1170653470-shelved',owner='f0c8c8a8631b4721beed577a99f8bdb7',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-10-02T12:21:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.615 2 WARNING nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.618 2 DEBUG nova.virt.libvirt.host [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.619 2 DEBUG nova.virt.libvirt.host [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.621 2 DEBUG nova.virt.libvirt.host [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.622 2 DEBUG nova.virt.libvirt.host [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.623 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.623 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='4ba12734f950c2c692266126a9babbc1',container_format='bare',created_at=2025-10-02T12:21:16Z,direct_url=<?>,disk_format='qcow2',id=e54e42ff-f245-4c1b-a659-20ba701a4194,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1170653470-shelved',owner='f0c8c8a8631b4721beed577a99f8bdb7',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-10-02T12:21:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.623 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.623 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.623 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.624 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.624 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.624 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.624 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.624 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.624 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.625 2 DEBUG nova.virt.hardware [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.625 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.643 2 DEBUG nova.virt.libvirt.vif [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:19:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1170653470',display_name='tempest-ServersNegativeTestJSON-server-1170653470',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1170653470',id=101,image_ref='e54e42ff-f245-4c1b-a659-20ba701a4194',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='f0c8c8a8631b4721beed577a99f8bdb7',ramdisk_id='',reservation_id='r-sfsoaqzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-114354241',owner_user_name='tempest-ServersNegativeTestJSON-114354241-project-member',shelved_at='2025-10-02T12:21:23.459143',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='e54e42ff-f245-4c1b-a659-20ba701a4194'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:34Z,user_data=None,user_id='a803afe9939346088252c3b944f124f2',uuid=35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.643 2 DEBUG nova.network.os_vif_util [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converting VIF {"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.644 2 DEBUG nova.network.os_vif_util [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.645 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.669 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <uuid>35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c</uuid>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <name>instance-00000065</name>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServersNegativeTestJSON-server-1170653470</nova:name>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:21:43</nova:creationTime>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:user uuid="a803afe9939346088252c3b944f124f2">tempest-ServersNegativeTestJSON-114354241-project-member</nova:user>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:project uuid="f0c8c8a8631b4721beed577a99f8bdb7">tempest-ServersNegativeTestJSON-114354241</nova:project>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="e54e42ff-f245-4c1b-a659-20ba701a4194"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        <nova:port uuid="0c328734-ebc6-47bc-b603-2e4af1cae573">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <entry name="serial">35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c</entry>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <entry name="uuid">35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c</entry>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk.config"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:ef:e3:79"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <target dev="tap0c328734-eb"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/console.log" append="off"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <input type="keyboard" bus="usb"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:21:43 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:21:43 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:21:43 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:21:43 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.670 2 DEBUG nova.compute.manager [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Preparing to wait for external event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.670 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.670 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.670 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.671 2 DEBUG nova.virt.libvirt.vif [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:19:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1170653470',display_name='tempest-ServersNegativeTestJSON-server-1170653470',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1170653470',id=101,image_ref='e54e42ff-f245-4c1b-a659-20ba701a4194',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='f0c8c8a8631b4721beed577a99f8bdb7',ramdisk_id='',reservation_id='r-sfsoaqzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-114354241',owner_user_name='tempest-ServersNegativeTestJSON-114354241-project-member',shelved_at='2025-10-02T12:21:23.459143',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='e54e42ff-f245-4c1b-a659-20ba701a4194'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:34Z,user_data=None,user_id='a803afe9939346088252c3b944f124f2',uuid=35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.671 2 DEBUG nova.network.os_vif_util [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converting VIF {"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.672 2 DEBUG nova.network.os_vif_util [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.672 2 DEBUG os_vif [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.674 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.674 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.680 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c328734-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.681 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c328734-eb, col_values=(('external_ids', {'iface-id': '0c328734-ebc6-47bc-b603-2e4af1cae573', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:e3:79', 'vm-uuid': '35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:43 np0005466013 NetworkManager[51205]: <info>  [1759407703.6833] manager: (tap0c328734-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.691 2 INFO os_vif [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb')#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.739 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.740 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.740 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] No VIF found with MAC fa:16:3e:ef:e3:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.740 2 INFO nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Using config drive#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.754 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.788 2 DEBUG nova.objects.instance [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'keypairs' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:43 np0005466013 nova_compute[192144]: 2025-10-02 12:21:43.908 2 DEBUG nova.network.neutron [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Successfully created port: 6374e02b-d27f-466e-9a75-8ba586327036 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:21:44 np0005466013 nova_compute[192144]: 2025-10-02 12:21:44.564 2 INFO nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Creating config drive at /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk.config#033[00m
Oct  2 08:21:44 np0005466013 nova_compute[192144]: 2025-10-02 12:21:44.569 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwf1zcd6m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:44 np0005466013 nova_compute[192144]: 2025-10-02 12:21:44.703 2 DEBUG oslo_concurrency.processutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwf1zcd6m" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:44 np0005466013 kernel: tap0c328734-eb: entered promiscuous mode
Oct  2 08:21:44 np0005466013 NetworkManager[51205]: <info>  [1759407704.7580] manager: (tap0c328734-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Oct  2 08:21:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:44Z|00413|binding|INFO|Claiming lport 0c328734-ebc6-47bc-b603-2e4af1cae573 for this chassis.
Oct  2 08:21:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:44Z|00414|binding|INFO|0c328734-ebc6-47bc-b603-2e4af1cae573: Claiming fa:16:3e:ef:e3:79 10.100.0.10
Oct  2 08:21:44 np0005466013 nova_compute[192144]: 2025-10-02 12:21:44.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:44Z|00415|binding|INFO|Setting lport 0c328734-ebc6-47bc-b603-2e4af1cae573 ovn-installed in OVS
Oct  2 08:21:44 np0005466013 nova_compute[192144]: 2025-10-02 12:21:44.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:44 np0005466013 nova_compute[192144]: 2025-10-02 12:21:44.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:44 np0005466013 nova_compute[192144]: 2025-10-02 12:21:44.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:44 np0005466013 systemd-machined[152202]: New machine qemu-49-instance-00000065.
Oct  2 08:21:44 np0005466013 systemd[1]: Started Virtual Machine qemu-49-instance-00000065.
Oct  2 08:21:44 np0005466013 systemd-udevd[236153]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:44 np0005466013 NetworkManager[51205]: <info>  [1759407704.8450] device (tap0c328734-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:21:44 np0005466013 NetworkManager[51205]: <info>  [1759407704.8457] device (tap0c328734-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:21:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:44Z|00416|binding|INFO|Setting lport 0c328734-ebc6-47bc-b603-2e4af1cae573 up in Southbound
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.934 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:e3:79 10.100.0.10'], port_security=['fa:16:3e:ef:e3:79 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f494075-66bf-4ce0-a765-98fd91c31199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0c8c8a8631b4721beed577a99f8bdb7', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'eb030dcc-72ea-4850-916a-e1df7c4d9a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43b5827-85bf-4b83-b921-ec45e12f1f2e, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0c328734-ebc6-47bc-b603-2e4af1cae573) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.936 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0c328734-ebc6-47bc-b603-2e4af1cae573 in datapath 8f494075-66bf-4ce0-a765-98fd91c31199 bound to our chassis#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.938 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f494075-66bf-4ce0-a765-98fd91c31199#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.956 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6454966c-1214-4911-91a1-58e623258083]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.957 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8f494075-61 in ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.963 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8f494075-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.963 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e2bece5a-1851-4ab4-a89e-7b251c9d064d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.964 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5a55595f-ce30-419f-8ea0-3eeda47c73cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.976 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[1c00e0a1-027b-4cfe-a9ea-d2fad0c9fea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:44.993 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb86b34-3879-46ba-991f-f09519a8f1bb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.019 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9132b04f-9d29-4a37-a6bf-e9ab650127c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.025 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[543fcd8d-20fc-4307-ad20-93b19f895fe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 systemd-udevd[236158]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:45 np0005466013 NetworkManager[51205]: <info>  [1759407705.0271] manager: (tap8f494075-60): new Veth device (/org/freedesktop/NetworkManager/Devices/196)
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.061 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[69cc749c-9be2-456a-b331-c9bc5300a32f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.065 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fbadd0e3-e88e-476e-be5e-1ef763ea1674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 NetworkManager[51205]: <info>  [1759407705.0879] device (tap8f494075-60): carrier: link connected
Oct  2 08:21:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:45Z|00417|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.102 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0860d3-d0ff-45e2-9fde-6148b880cd43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.121 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[58806928-9735-45a6-b951-f110a67acf20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f494075-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:9a:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569670, 'reachable_time': 15866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236186, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.140 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[282baa02-c14d-44a2-8c79-7aef02de344e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:9a65'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569670, 'tstamp': 569670}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236187, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.155 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc652fed-663d-4604-a242-7fa8739294c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f494075-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:9a:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569670, 'reachable_time': 15866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236188, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.186 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[23dff263-a3ca-4f7a-903a-12de60993b22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.249 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d676a911-aa5a-41c5-a0cb-8d421c2f1855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.250 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f494075-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.251 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.251 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f494075-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:45 np0005466013 kernel: tap8f494075-60: entered promiscuous mode
Oct  2 08:21:45 np0005466013 NetworkManager[51205]: <info>  [1759407705.2808] manager: (tap8f494075-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.284 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f494075-60, col_values=(('external_ids', {'iface-id': 'a5eb523a-b004-42b7-a3f6-24b2514f40bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:45Z|00418|binding|INFO|Releasing lport a5eb523a-b004-42b7-a3f6-24b2514f40bf from this chassis (sb_readonly=0)
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.301 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.302 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a4d659-8924-4cbf-9952-666ae2492ce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.303 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-8f494075-66bf-4ce0-a765-98fd91c31199
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 8f494075-66bf-4ce0-a765-98fd91c31199
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:21:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:45.305 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'env', 'PROCESS_TAG=haproxy-8f494075-66bf-4ce0-a765-98fd91c31199', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8f494075-66bf-4ce0-a765-98fd91c31199.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.637 2 DEBUG nova.compute.manager [req-b40d04aa-1af9-457a-9063-aa833f44ff09 req-547b670c-d706-4e2e-b197-a51e19ef27c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.638 2 DEBUG oslo_concurrency.lockutils [req-b40d04aa-1af9-457a-9063-aa833f44ff09 req-547b670c-d706-4e2e-b197-a51e19ef27c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.639 2 DEBUG oslo_concurrency.lockutils [req-b40d04aa-1af9-457a-9063-aa833f44ff09 req-547b670c-d706-4e2e-b197-a51e19ef27c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.639 2 DEBUG oslo_concurrency.lockutils [req-b40d04aa-1af9-457a-9063-aa833f44ff09 req-547b670c-d706-4e2e-b197-a51e19ef27c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.639 2 DEBUG nova.compute.manager [req-b40d04aa-1af9-457a-9063-aa833f44ff09 req-547b670c-d706-4e2e-b197-a51e19ef27c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Processing event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.681 2 DEBUG nova.network.neutron [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Successfully updated port: 6374e02b-d27f-466e-9a75-8ba586327036 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.697 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.699 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquired lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.699 2 DEBUG nova.network.neutron [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:21:45 np0005466013 podman[236227]: 2025-10-02 12:21:45.728463493 +0000 UTC m=+0.066770967 container create e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:21:45 np0005466013 systemd[1]: Started libpod-conmon-e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0.scope.
Oct  2 08:21:45 np0005466013 podman[236227]: 2025-10-02 12:21:45.680719059 +0000 UTC m=+0.019026553 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:21:45 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:21:45 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aec24bda8cc799c9deb7b8e4d927b33413eea22ba4950f6ac9e2479e484e9e0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:21:45 np0005466013 podman[236227]: 2025-10-02 12:21:45.816221162 +0000 UTC m=+0.154528656 container init e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:45 np0005466013 podman[236227]: 2025-10-02 12:21:45.822463516 +0000 UTC m=+0.160770990 container start e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.840 2 DEBUG nova.network.neutron [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:21:45 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236242]: [NOTICE]   (236246) : New worker (236248) forked
Oct  2 08:21:45 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236242]: [NOTICE]   (236246) : Loading success.
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.900 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407705.9003937, 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.901 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.904 2 DEBUG nova.compute.manager [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.907 2 DEBUG nova.virt.libvirt.driver [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.910 2 INFO nova.virt.libvirt.driver [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Instance spawned successfully.#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.923 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.925 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.947 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.947 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407705.9011912, 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.947 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.966 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.968 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407705.9066453, 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.968 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.983 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:45 np0005466013 nova_compute[192144]: 2025-10-02 12:21:45.985 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.000 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.568 2 DEBUG nova.compute.manager [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.658 2 DEBUG oslo_concurrency.lockutils [None req-a993ab94-0da0-4fe5-b7df-b24ee94ef568 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 11.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.692 2 DEBUG nova.network.neutron [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updating instance_info_cache with network_info: [{"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.707 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Releasing lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.708 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Instance network_info: |[{"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.711 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Start _get_guest_xml network_info=[{"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.715 2 WARNING nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.720 2 DEBUG nova.virt.libvirt.host [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.722 2 DEBUG nova.virt.libvirt.host [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.725 2 DEBUG nova.virt.libvirt.host [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.726 2 DEBUG nova.virt.libvirt.host [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.727 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.728 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.729 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.729 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.730 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.730 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.731 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.731 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.732 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.732 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.733 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.733 2 DEBUG nova.virt.hardware [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.739 2 DEBUG nova.virt.libvirt.vif [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1061088034',display_name='tempest-ServerActionsTestOtherB-server-1061088034',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1061088034',id=113,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-f3s92q8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTes
tOtherB-263921372-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:42Z,user_data=None,user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=f2f0b852-0c4a-4d16-9c7f-54845e7f7b42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.739 2 DEBUG nova.network.os_vif_util [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.740 2 DEBUG nova.network.os_vif_util [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:03:e5,bridge_name='br-int',has_traffic_filtering=True,id=6374e02b-d27f-466e-9a75-8ba586327036,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6374e02b-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.742 2 DEBUG nova.objects.instance [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.757 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <uuid>f2f0b852-0c4a-4d16-9c7f-54845e7f7b42</uuid>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <name>instance-00000071</name>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerActionsTestOtherB-server-1061088034</nova:name>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:21:46</nova:creationTime>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:user uuid="0ea122e2fff94f2ba7c78bf30b04029c">tempest-ServerActionsTestOtherB-263921372-project-member</nova:user>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:project uuid="ffce7d629aa24a7f970d93b2a79045f1">tempest-ServerActionsTestOtherB-263921372</nova:project>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        <nova:port uuid="6374e02b-d27f-466e-9a75-8ba586327036">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <entry name="serial">f2f0b852-0c4a-4d16-9c7f-54845e7f7b42</entry>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <entry name="uuid">f2f0b852-0c4a-4d16-9c7f-54845e7f7b42</entry>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.config"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:17:03:e5"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <target dev="tap6374e02b-d2"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/console.log" append="off"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:21:46 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:21:46 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:21:46 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:21:46 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.759 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Preparing to wait for external event network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.759 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.760 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.760 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.760 2 DEBUG nova.virt.libvirt.vif [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1061088034',display_name='tempest-ServerActionsTestOtherB-server-1061088034',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1061088034',id=113,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-f3s92q8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:42Z,user_data=None,user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=f2f0b852-0c4a-4d16-9c7f-54845e7f7b42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.761 2 DEBUG nova.network.os_vif_util [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.761 2 DEBUG nova.network.os_vif_util [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:03:e5,bridge_name='br-int',has_traffic_filtering=True,id=6374e02b-d27f-466e-9a75-8ba586327036,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6374e02b-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.762 2 DEBUG os_vif [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:03:e5,bridge_name='br-int',has_traffic_filtering=True,id=6374e02b-d27f-466e-9a75-8ba586327036,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6374e02b-d2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.763 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.763 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.765 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6374e02b-d2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.766 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6374e02b-d2, col_values=(('external_ids', {'iface-id': '6374e02b-d27f-466e-9a75-8ba586327036', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:03:e5', 'vm-uuid': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:46 np0005466013 NetworkManager[51205]: <info>  [1759407706.7685] manager: (tap6374e02b-d2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.779 2 INFO os_vif [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:03:e5,bridge_name='br-int',has_traffic_filtering=True,id=6374e02b-d27f-466e-9a75-8ba586327036,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6374e02b-d2')#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.904 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.904 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.904 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No VIF found with MAC fa:16:3e:17:03:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:21:46 np0005466013 nova_compute[192144]: 2025-10-02 12:21:46.905 2 INFO nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Using config drive#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.279 2 INFO nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Creating config drive at /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.config#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.285 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfmsfez2y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.420 2 DEBUG oslo_concurrency.processutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfmsfez2y" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:47 np0005466013 kernel: tap6374e02b-d2: entered promiscuous mode
Oct  2 08:21:47 np0005466013 NetworkManager[51205]: <info>  [1759407707.4832] manager: (tap6374e02b-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Oct  2 08:21:47 np0005466013 systemd-udevd[236180]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:47 np0005466013 NetworkManager[51205]: <info>  [1759407707.4991] device (tap6374e02b-d2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:21:47 np0005466013 NetworkManager[51205]: <info>  [1759407707.5002] device (tap6374e02b-d2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:21:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:47Z|00419|binding|INFO|Claiming lport 6374e02b-d27f-466e-9a75-8ba586327036 for this chassis.
Oct  2 08:21:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:47Z|00420|binding|INFO|6374e02b-d27f-466e-9a75-8ba586327036: Claiming fa:16:3e:17:03:e5 10.100.0.6
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:47Z|00421|binding|INFO|Setting lport 6374e02b-d27f-466e-9a75-8ba586327036 ovn-installed in OVS
Oct  2 08:21:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:47Z|00422|binding|INFO|Setting lport 6374e02b-d27f-466e-9a75-8ba586327036 up in Southbound
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.523 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:03:e5 10.100.0.6'], port_security=['fa:16:3e:17:03:e5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7993d405-22b6-4649-b5b8-9f3e7d07d4ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=6374e02b-d27f-466e-9a75-8ba586327036) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.524 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 6374e02b-d27f-466e-9a75-8ba586327036 in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 bound to our chassis#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.526 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20eb29be-ee23-463b-85af-bfc2388e9f77#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.544 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6116af5f-8166-44da-b2c8-00abd8833b0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:47 np0005466013 systemd-machined[152202]: New machine qemu-50-instance-00000071.
Oct  2 08:21:47 np0005466013 systemd[1]: Started Virtual Machine qemu-50-instance-00000071.
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.579 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f144c22a-06bc-4aab-bf2f-4c2feedc32a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.582 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[8d800356-ef20-472f-99a0-24c38a9d4e33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.607 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f52c35-06ab-4e39-85e4-3cd00c0d8bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.639 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[beaa9fbc-bc1e-4d09-bc7b-4129d6121568]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561951, 'reachable_time': 40813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236289, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.657 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a16a483e-1b39-4bc2-84b4-daf173ed84ee]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561963, 'tstamp': 561963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236291, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561966, 'tstamp': 561966}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236291, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.660 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.663 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20eb29be-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.663 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.664 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20eb29be-e0, col_values=(('external_ids', {'iface-id': 'e533861f-45cb-4843-b071-0b628ca25128'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:47.664 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.744 2 DEBUG nova.compute.manager [req-1f61b8c2-47b0-49a3-a881-f78698bb08a4 req-79270238-47c5-4b22-b568-95392e2bee2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received event network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.745 2 DEBUG oslo_concurrency.lockutils [req-1f61b8c2-47b0-49a3-a881-f78698bb08a4 req-79270238-47c5-4b22-b568-95392e2bee2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.745 2 DEBUG oslo_concurrency.lockutils [req-1f61b8c2-47b0-49a3-a881-f78698bb08a4 req-79270238-47c5-4b22-b568-95392e2bee2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.746 2 DEBUG oslo_concurrency.lockutils [req-1f61b8c2-47b0-49a3-a881-f78698bb08a4 req-79270238-47c5-4b22-b568-95392e2bee2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.746 2 DEBUG nova.compute.manager [req-1f61b8c2-47b0-49a3-a881-f78698bb08a4 req-79270238-47c5-4b22-b568-95392e2bee2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Processing event network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.762 2 DEBUG nova.compute.manager [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received event network-changed-6374e02b-d27f-466e-9a75-8ba586327036 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.762 2 DEBUG nova.compute.manager [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Refreshing instance network info cache due to event network-changed-6374e02b-d27f-466e-9a75-8ba586327036. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.763 2 DEBUG oslo_concurrency.lockutils [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.763 2 DEBUG oslo_concurrency.lockutils [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:47 np0005466013 nova_compute[192144]: 2025-10-02 12:21:47.763 2 DEBUG nova.network.neutron [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Refreshing network info cache for port 6374e02b-d27f-466e-9a75-8ba586327036 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.269 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407708.268988, f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.270 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] VM Started (Lifecycle Event)#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.272 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.279 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.283 2 INFO nova.virt.libvirt.driver [-] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Instance spawned successfully.#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.284 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.310 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.314 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.322 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.323 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.324 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.324 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.325 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.326 2 DEBUG nova.virt.libvirt.driver [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.359 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.359 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407708.2691581, f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.360 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.389 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.392 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407708.2773743, f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.392 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.417 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.419 2 INFO nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Took 5.46 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.419 2 DEBUG nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.423 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.455 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.538 2 INFO nova.compute.manager [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Took 6.02 seconds to build instance.#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.561 2 DEBUG oslo_concurrency.lockutils [None req-52783027-268f-4336-aa58-f933dc7ec2f8 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.787 2 DEBUG nova.network.neutron [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updated VIF entry in instance network info cache for port 6374e02b-d27f-466e-9a75-8ba586327036. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.788 2 DEBUG nova.network.neutron [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updating instance_info_cache with network_info: [{"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.805 2 DEBUG oslo_concurrency.lockutils [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.805 2 DEBUG nova.compute.manager [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.805 2 DEBUG oslo_concurrency.lockutils [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.806 2 DEBUG oslo_concurrency.lockutils [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.806 2 DEBUG oslo_concurrency.lockutils [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.806 2 DEBUG nova.compute.manager [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] No waiting events found dispatching network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.806 2 WARNING nova.compute.manager [req-2f799ea7-3001-4c61-82fc-0eca7c156ab9 req-0fa18da7-bfe4-4e48-8802-180b70374d41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received unexpected event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:21:48 np0005466013 nova_compute[192144]: 2025-10-02 12:21:48.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.178 2 INFO nova.compute.manager [None req-4daff1b4-3cdb-4e23-b2b0-cab357c69fc7 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Get console output#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.304 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.817 2 DEBUG nova.compute.manager [req-cc89276a-95a7-42a7-9969-185415d437f9 req-bd084e6e-a6b9-4074-967d-8c1190090b9f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received event network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.818 2 DEBUG oslo_concurrency.lockutils [req-cc89276a-95a7-42a7-9969-185415d437f9 req-bd084e6e-a6b9-4074-967d-8c1190090b9f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.818 2 DEBUG oslo_concurrency.lockutils [req-cc89276a-95a7-42a7-9969-185415d437f9 req-bd084e6e-a6b9-4074-967d-8c1190090b9f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.819 2 DEBUG oslo_concurrency.lockutils [req-cc89276a-95a7-42a7-9969-185415d437f9 req-bd084e6e-a6b9-4074-967d-8c1190090b9f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.819 2 DEBUG nova.compute.manager [req-cc89276a-95a7-42a7-9969-185415d437f9 req-bd084e6e-a6b9-4074-967d-8c1190090b9f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] No waiting events found dispatching network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:49 np0005466013 nova_compute[192144]: 2025-10-02 12:21:49.819 2 WARNING nova.compute.manager [req-cc89276a-95a7-42a7-9969-185415d437f9 req-bd084e6e-a6b9-4074-967d-8c1190090b9f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received unexpected event network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 for instance with vm_state active and task_state None.
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.115 2 DEBUG nova.objects.instance [None req-355db3cc-dfbd-4e24-b136-261242e16d8c a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.147 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407710.1468182, 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.148 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] VM Paused (Lifecycle Event)
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.178 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.182 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.198 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] During sync_power_state the instance has a pending task (suspending). Skip.
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.435 2 INFO nova.compute.manager [None req-023e24c4-6cfc-4610-9d10-e69bedcef848 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Get console output
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.443 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  2 08:21:50 np0005466013 kernel: tap0c328734-eb (unregistering): left promiscuous mode
Oct  2 08:21:50 np0005466013 NetworkManager[51205]: <info>  [1759407710.9415] device (tap0c328734-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:21:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:50Z|00423|binding|INFO|Releasing lport 0c328734-ebc6-47bc-b603-2e4af1cae573 from this chassis (sb_readonly=0)
Oct  2 08:21:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:50Z|00424|binding|INFO|Setting lport 0c328734-ebc6-47bc-b603-2e4af1cae573 down in Southbound
Oct  2 08:21:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:50Z|00425|binding|INFO|Removing iface tap0c328734-eb ovn-installed in OVS
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:50.965 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:e3:79 10.100.0.10'], port_security=['fa:16:3e:ef:e3:79 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f494075-66bf-4ce0-a765-98fd91c31199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0c8c8a8631b4721beed577a99f8bdb7', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'eb030dcc-72ea-4850-916a-e1df7c4d9a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43b5827-85bf-4b83-b921-ec45e12f1f2e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0c328734-ebc6-47bc-b603-2e4af1cae573) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:21:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:50.966 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0c328734-ebc6-47bc-b603-2e4af1cae573 in datapath 8f494075-66bf-4ce0-a765-98fd91c31199 unbound from our chassis
Oct  2 08:21:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:50.967 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f494075-66bf-4ce0-a765-98fd91c31199, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:21:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:50.972 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b40917ec-4282-4002-aaa6-e67e7013092f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:50 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:50.972 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 namespace which is not needed anymore
Oct  2 08:21:50 np0005466013 nova_compute[192144]: 2025-10-02 12:21:50.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:51 np0005466013 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000065.scope: Deactivated successfully.
Oct  2 08:21:51 np0005466013 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000065.scope: Consumed 5.430s CPU time.
Oct  2 08:21:51 np0005466013 systemd-machined[152202]: Machine qemu-49-instance-00000065 terminated.
Oct  2 08:21:51 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236242]: [NOTICE]   (236246) : haproxy version is 2.8.14-c23fe91
Oct  2 08:21:51 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236242]: [NOTICE]   (236246) : path to executable is /usr/sbin/haproxy
Oct  2 08:21:51 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236242]: [WARNING]  (236246) : Exiting Master process...
Oct  2 08:21:51 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236242]: [ALERT]    (236246) : Current worker (236248) exited with code 143 (Terminated)
Oct  2 08:21:51 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236242]: [WARNING]  (236246) : All workers exited. Exiting... (0)
Oct  2 08:21:51 np0005466013 systemd[1]: libpod-e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0.scope: Deactivated successfully.
Oct  2 08:21:51 np0005466013 podman[236326]: 2025-10-02 12:21:51.123690748 +0000 UTC m=+0.046209086 container died e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0-userdata-shm.mount: Deactivated successfully.
Oct  2 08:21:51 np0005466013 systemd[1]: var-lib-containers-storage-overlay-aec24bda8cc799c9deb7b8e4d927b33413eea22ba4950f6ac9e2479e484e9e0e-merged.mount: Deactivated successfully.
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.187 2 DEBUG nova.compute.manager [None req-355db3cc-dfbd-4e24-b136-261242e16d8c a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:51 np0005466013 podman[236326]: 2025-10-02 12:21:51.201242719 +0000 UTC m=+0.123761037 container cleanup e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:21:51 np0005466013 systemd[1]: libpod-conmon-e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0.scope: Deactivated successfully.
Oct  2 08:21:51 np0005466013 podman[236369]: 2025-10-02 12:21:51.278589324 +0000 UTC m=+0.054234417 container remove e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.286 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[acb30b31-646e-494c-85b9-e58217f7b370]: (4, ('Thu Oct  2 12:21:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 (e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0)\ne9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0\nThu Oct  2 12:21:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 (e9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0)\ne9c452881d770ad7ee3e9af91811c0e9826538ad3cee52a57324aba7dd0c91c0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.288 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b4202baf-8ede-4356-b597-e60868eeecf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.289 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f494075-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:51 np0005466013 kernel: tap8f494075-60: left promiscuous mode
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.380 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0ebd23ae-3552-429d-b505-f0d20f521b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.408 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[37159196-1611-411a-b4bb-f0fd976e2f7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.410 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[752218cf-0b1f-40cf-a918-bfb0c181086f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.432 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[311f7dd1-1090-4e0f-9c44-358d1417e246]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569663, 'reachable_time': 42832, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236385, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:51 np0005466013 systemd[1]: run-netns-ovnmeta\x2d8f494075\x2d66bf\x2d4ce0\x2da765\x2d98fd91c31199.mount: Deactivated successfully.
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.435 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct  2 08:21:51 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:51.435 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[c8039865-be44-4d5b-ac27-c0dbce9b1a8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.441 2 DEBUG nova.compute.manager [req-b4f24017-c2a8-40f1-b0af-ee24b551e48f req-fdacd22b-6505-410c-8511-649f966c023d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-unplugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.441 2 DEBUG oslo_concurrency.lockutils [req-b4f24017-c2a8-40f1-b0af-ee24b551e48f req-fdacd22b-6505-410c-8511-649f966c023d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.441 2 DEBUG oslo_concurrency.lockutils [req-b4f24017-c2a8-40f1-b0af-ee24b551e48f req-fdacd22b-6505-410c-8511-649f966c023d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.441 2 DEBUG oslo_concurrency.lockutils [req-b4f24017-c2a8-40f1-b0af-ee24b551e48f req-fdacd22b-6505-410c-8511-649f966c023d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.442 2 DEBUG nova.compute.manager [req-b4f24017-c2a8-40f1-b0af-ee24b551e48f req-fdacd22b-6505-410c-8511-649f966c023d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] No waiting events found dispatching network-vif-unplugged-0c328734-ebc6-47bc-b603-2e4af1cae573 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.442 2 WARNING nova.compute.manager [req-b4f24017-c2a8-40f1-b0af-ee24b551e48f req-fdacd22b-6505-410c-8511-649f966c023d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received unexpected event network-vif-unplugged-0c328734-ebc6-47bc-b603-2e4af1cae573 for instance with vm_state suspended and task_state None.
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.554 2 INFO nova.compute.manager [None req-de9a89d4-9935-448c-a441-ae77338bead5 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Get console output
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.560 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  2 08:21:51 np0005466013 nova_compute[192144]: 2025-10-02 12:21:51.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:52 np0005466013 podman[236387]: 2025-10-02 12:21:52.720000994 +0000 UTC m=+0.085893091 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Oct  2 08:21:52 np0005466013 podman[236386]: 2025-10-02 12:21:52.720040526 +0000 UTC m=+0.088027088 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:21:52 np0005466013 podman[236388]: 2025-10-02 12:21:52.729780059 +0000 UTC m=+0.093323803 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:21:53 np0005466013 nova_compute[192144]: 2025-10-02 12:21:53.513 2 DEBUG nova.compute.manager [req-c448efba-45a8-490f-9e4b-53b743f4baa3 req-6f05584b-238c-4e05-8b8a-3ae07c88df9a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:53 np0005466013 nova_compute[192144]: 2025-10-02 12:21:53.514 2 DEBUG oslo_concurrency.lockutils [req-c448efba-45a8-490f-9e4b-53b743f4baa3 req-6f05584b-238c-4e05-8b8a-3ae07c88df9a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:53 np0005466013 nova_compute[192144]: 2025-10-02 12:21:53.514 2 DEBUG oslo_concurrency.lockutils [req-c448efba-45a8-490f-9e4b-53b743f4baa3 req-6f05584b-238c-4e05-8b8a-3ae07c88df9a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:53 np0005466013 nova_compute[192144]: 2025-10-02 12:21:53.514 2 DEBUG oslo_concurrency.lockutils [req-c448efba-45a8-490f-9e4b-53b743f4baa3 req-6f05584b-238c-4e05-8b8a-3ae07c88df9a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:53 np0005466013 nova_compute[192144]: 2025-10-02 12:21:53.515 2 DEBUG nova.compute.manager [req-c448efba-45a8-490f-9e4b-53b743f4baa3 req-6f05584b-238c-4e05-8b8a-3ae07c88df9a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] No waiting events found dispatching network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:53 np0005466013 nova_compute[192144]: 2025-10-02 12:21:53.515 2 WARNING nova.compute.manager [req-c448efba-45a8-490f-9e4b-53b743f4baa3 req-6f05584b-238c-4e05-8b8a-3ae07c88df9a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received unexpected event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 for instance with vm_state suspended and task_state None.
Oct  2 08:21:53 np0005466013 nova_compute[192144]: 2025-10-02 12:21:53.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:54 np0005466013 nova_compute[192144]: 2025-10-02 12:21:54.706 2 INFO nova.compute.manager [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Resuming
Oct  2 08:21:54 np0005466013 nova_compute[192144]: 2025-10-02 12:21:54.707 2 DEBUG nova.objects.instance [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'flavor' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:21:54 np0005466013 nova_compute[192144]: 2025-10-02 12:21:54.751 2 DEBUG oslo_concurrency.lockutils [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:21:54 np0005466013 nova_compute[192144]: 2025-10-02 12:21:54.751 2 DEBUG oslo_concurrency.lockutils [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquired lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:21:54 np0005466013 nova_compute[192144]: 2025-10-02 12:21:54.752 2 DEBUG nova.network.neutron [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:21:56 np0005466013 nova_compute[192144]: 2025-10-02 12:21:56.636 2 DEBUG nova.compute.manager [None req-65333172-b9ef-4a87-88d4-b2d91b0d445d 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Oct  2 08:21:56 np0005466013 nova_compute[192144]: 2025-10-02 12:21:56.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.609 2 DEBUG nova.network.neutron [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Updating instance_info_cache with network_info: [{"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.623 2 DEBUG oslo_concurrency.lockutils [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Releasing lock "refresh_cache-35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.628 2 DEBUG nova.virt.libvirt.vif [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:19:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1170653470',display_name='tempest-ServersNegativeTestJSON-server-1170653470',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1170653470',id=101,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:46Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='f0c8c8a8631b4721beed577a99f8bdb7',ramdisk_id='',reservation_id='r-sfsoaqzs',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-114354241',owner_user_name='tempest-ServersNegativeTestJSON-114354241-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:21:51Z,user_data=None,user_id='a803afe9939346088252c3b944f124f2',uuid=35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.629 2 DEBUG nova.network.os_vif_util [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converting VIF {"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.630 2 DEBUG nova.network.os_vif_util [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.630 2 DEBUG os_vif [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.631 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.632 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c328734-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.637 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c328734-eb, col_values=(('external_ids', {'iface-id': '0c328734-ebc6-47bc-b603-2e4af1cae573', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:e3:79', 'vm-uuid': '35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.637 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.638 2 INFO os_vif [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb')#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.676 2 DEBUG nova.objects.instance [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'numa_topology' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:57 np0005466013 kernel: tap0c328734-eb: entered promiscuous mode
Oct  2 08:21:57 np0005466013 NetworkManager[51205]: <info>  [1759407717.7586] manager: (tap0c328734-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Oct  2 08:21:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:57Z|00426|binding|INFO|Claiming lport 0c328734-ebc6-47bc-b603-2e4af1cae573 for this chassis.
Oct  2 08:21:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:57Z|00427|binding|INFO|0c328734-ebc6-47bc-b603-2e4af1cae573: Claiming fa:16:3e:ef:e3:79 10.100.0.10
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:57Z|00428|binding|INFO|Setting lport 0c328734-ebc6-47bc-b603-2e4af1cae573 ovn-installed in OVS
Oct  2 08:21:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:57Z|00429|binding|INFO|Setting lport 0c328734-ebc6-47bc-b603-2e4af1cae573 up in Southbound
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.781 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:e3:79 10.100.0.10'], port_security=['fa:16:3e:ef:e3:79 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f494075-66bf-4ce0-a765-98fd91c31199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0c8c8a8631b4721beed577a99f8bdb7', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'eb030dcc-72ea-4850-916a-e1df7c4d9a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43b5827-85bf-4b83-b921-ec45e12f1f2e, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0c328734-ebc6-47bc-b603-2e4af1cae573) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.782 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0c328734-ebc6-47bc-b603-2e4af1cae573 in datapath 8f494075-66bf-4ce0-a765-98fd91c31199 bound to our chassis#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.783 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f494075-66bf-4ce0-a765-98fd91c31199#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:57 np0005466013 nova_compute[192144]: 2025-10-02 12:21:57.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.801 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[52d5e3aa-f88a-4ae3-b077-dc77cd5d2706]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.802 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8f494075-61 in ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.805 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8f494075-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.805 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[58e2c347-bc09-47db-ac6e-8969e764a412]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.806 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[daf76ac5-0682-425e-a9e1-93a0ac70cef4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.824 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[04cc22aa-d6e7-4716-8c9c-81d343e1c3d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 systemd-machined[152202]: New machine qemu-51-instance-00000065.
Oct  2 08:21:57 np0005466013 systemd-udevd[236469]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:57 np0005466013 systemd[1]: Started Virtual Machine qemu-51-instance-00000065.
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.850 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc78d4b-9c39-41c3-bd17-f0060b66e95c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 NetworkManager[51205]: <info>  [1759407717.8556] device (tap0c328734-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:21:57 np0005466013 NetworkManager[51205]: <info>  [1759407717.8564] device (tap0c328734-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.885 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef9cc6d-785b-4be0-8935-3deffff2e7c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 NetworkManager[51205]: <info>  [1759407717.9015] manager: (tap8f494075-60): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.890 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[24af1109-91ab-4114-9f96-d6a19d8d81f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.939 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[6410b545-d45f-446d-841b-3e5bd589e1f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.942 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a4a68c-b38c-41e6-a07b-7e11d8d4337a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 NetworkManager[51205]: <info>  [1759407717.9658] device (tap8f494075-60): carrier: link connected
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.973 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ac6f9ce3-42c8-49be-aca3-f2a8467a1006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:57.993 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bcae7ace-539f-45e4-982a-6d9073c42690]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f494075-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:9a:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570958, 'reachable_time': 26143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236499, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.008 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[113612e7-c017-4946-bf0b-eee09b754eb4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:9a65'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570958, 'tstamp': 570958}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236501, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.026 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb56115-c51e-4d58-9092-9ec839f09ec5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f494075-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:9a:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570958, 'reachable_time': 26143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236503, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.066 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2a578034-e789-406a-acc2-dfc03221803e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.137 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a0352e45-995a-4bca-b182-dda66d90ae8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.139 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f494075-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.139 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.139 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f494075-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:58 np0005466013 NetworkManager[51205]: <info>  [1759407718.1421] manager: (tap8f494075-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Oct  2 08:21:58 np0005466013 kernel: tap8f494075-60: entered promiscuous mode
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.146 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f494075-60, col_values=(('external_ids', {'iface-id': 'a5eb523a-b004-42b7-a3f6-24b2514f40bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:21:58Z|00430|binding|INFO|Releasing lport a5eb523a-b004-42b7-a3f6-24b2514f40bf from this chassis (sb_readonly=0)
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.163 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.164 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d7528084-f639-4365-b5aa-2e52fb16a554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.165 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-8f494075-66bf-4ce0-a765-98fd91c31199
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 8f494075-66bf-4ce0-a765-98fd91c31199
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.166 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'env', 'PROCESS_TAG=haproxy-8f494075-66bf-4ce0-a765-98fd91c31199', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8f494075-66bf-4ce0-a765-98fd91c31199.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.434 2 DEBUG nova.compute.manager [req-a37c1843-5ebe-4c2d-8a04-c0576bd98b1c req-f8458cff-de1d-4410-82cc-34efa67934bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.435 2 DEBUG oslo_concurrency.lockutils [req-a37c1843-5ebe-4c2d-8a04-c0576bd98b1c req-f8458cff-de1d-4410-82cc-34efa67934bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.435 2 DEBUG oslo_concurrency.lockutils [req-a37c1843-5ebe-4c2d-8a04-c0576bd98b1c req-f8458cff-de1d-4410-82cc-34efa67934bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.436 2 DEBUG oslo_concurrency.lockutils [req-a37c1843-5ebe-4c2d-8a04-c0576bd98b1c req-f8458cff-de1d-4410-82cc-34efa67934bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.436 2 DEBUG nova.compute.manager [req-a37c1843-5ebe-4c2d-8a04-c0576bd98b1c req-f8458cff-de1d-4410-82cc-34efa67934bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] No waiting events found dispatching network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.437 2 WARNING nova.compute.manager [req-a37c1843-5ebe-4c2d-8a04-c0576bd98b1c req-f8458cff-de1d-4410-82cc-34efa67934bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received unexpected event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.554 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:58 np0005466013 podman[236540]: 2025-10-02 12:21:58.62316237 +0000 UTC m=+0.083568668 container create 721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.668 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.670 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407718.6670828, 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.671 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:21:58 np0005466013 podman[236540]: 2025-10-02 12:21:58.575167718 +0000 UTC m=+0.035574036 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.692 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:58 np0005466013 systemd[1]: Started libpod-conmon-721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948.scope.
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.702 2 DEBUG nova.compute.manager [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.702 2 DEBUG nova.objects.instance [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.706 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.724 2 INFO nova.virt.libvirt.driver [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Instance running successfully.#033[00m
Oct  2 08:21:58 np0005466013 virtqemud[191867]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.728 2 DEBUG nova.virt.libvirt.guest [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.728 2 DEBUG nova.compute.manager [None req-d530c94c-fa04-4003-8537-e5ec3722c17d a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:58 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.734 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.735 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407718.6815968, 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.735 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:21:58 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/407e3931ab7ca3524164c0910d38b332a6ad9a8978f59a2f7ee5fe5890126d0b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:21:58 np0005466013 podman[236540]: 2025-10-02 12:21:58.761987176 +0000 UTC m=+0.222393554 container init 721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.763 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.767 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:58 np0005466013 podman[236540]: 2025-10-02 12:21:58.775278529 +0000 UTC m=+0.235684857 container start 721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.800 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 08:21:58 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236554]: [NOTICE]   (236558) : New worker (236560) forked
Oct  2 08:21:58 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236554]: [NOTICE]   (236558) : Loading success.
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.858 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:21:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:21:58.858 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:58 np0005466013 nova_compute[192144]: 2025-10-02 12:21:58.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.030 2 DEBUG oslo_concurrency.lockutils [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.030 2 DEBUG oslo_concurrency.lockutils [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.030 2 DEBUG nova.compute.manager [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.036 2 DEBUG nova.compute.manager [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.036 2 DEBUG nova.objects.instance [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'flavor' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.066 2 DEBUG nova.objects.instance [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'info_cache' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.103 2 DEBUG nova.virt.libvirt.driver [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.530 2 DEBUG nova.compute.manager [req-f11ca45c-c101-43eb-951a-1dd6d32b0ffb req-6205f04f-6416-487b-a5e0-882b9ca4dc89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.530 2 DEBUG oslo_concurrency.lockutils [req-f11ca45c-c101-43eb-951a-1dd6d32b0ffb req-6205f04f-6416-487b-a5e0-882b9ca4dc89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.530 2 DEBUG oslo_concurrency.lockutils [req-f11ca45c-c101-43eb-951a-1dd6d32b0ffb req-6205f04f-6416-487b-a5e0-882b9ca4dc89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.531 2 DEBUG oslo_concurrency.lockutils [req-f11ca45c-c101-43eb-951a-1dd6d32b0ffb req-6205f04f-6416-487b-a5e0-882b9ca4dc89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.531 2 DEBUG nova.compute.manager [req-f11ca45c-c101-43eb-951a-1dd6d32b0ffb req-6205f04f-6416-487b-a5e0-882b9ca4dc89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] No waiting events found dispatching network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:00 np0005466013 nova_compute[192144]: 2025-10-02 12:22:00.531 2 WARNING nova.compute.manager [req-f11ca45c-c101-43eb-951a-1dd6d32b0ffb req-6205f04f-6416-487b-a5e0-882b9ca4dc89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received unexpected event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:01 np0005466013 nova_compute[192144]: 2025-10-02 12:22:01.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.304 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.305 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.306 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:02 np0005466013 kernel: tapcae13af9-81 (unregistering): left promiscuous mode
Oct  2 08:22:02 np0005466013 NetworkManager[51205]: <info>  [1759407722.5020] device (tapcae13af9-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:22:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:02Z|00431|binding|INFO|Releasing lport cae13af9-8175-4eab-b9ec-18019b521d0b from this chassis (sb_readonly=0)
Oct  2 08:22:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:02Z|00432|binding|INFO|Setting lport cae13af9-8175-4eab-b9ec-18019b521d0b down in Southbound
Oct  2 08:22:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:02Z|00433|binding|INFO|Removing iface tapcae13af9-81 ovn-installed in OVS
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.581 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:d3:eb 10.100.0.14'], port_security=['fa:16:3e:35:d3:eb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ad2d69bb-3aa9-4c11-b9de-29996574cfa2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12e9168a-be86-462f-a658-971f38e3430f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=cae13af9-8175-4eab-b9ec-18019b521d0b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.582 103323 INFO neutron.agent.ovn.metadata.agent [-] Port cae13af9-8175-4eab-b9ec-18019b521d0b in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 unbound from our chassis#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.584 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20eb29be-ee23-463b-85af-bfc2388e9f77#033[00m
Oct  2 08:22:02 np0005466013 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Oct  2 08:22:02 np0005466013 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006c.scope: Consumed 16.940s CPU time.
Oct  2 08:22:02 np0005466013 systemd-machined[152202]: Machine qemu-48-instance-0000006c terminated.
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.610 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[01951381-760f-4092-b3e8-ad354ab7f231]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:02 np0005466013 podman[236593]: 2025-10-02 12:22:02.644235407 +0000 UTC m=+0.091090943 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.654 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0cffba1d-4b56-4897-8125-2f91ad370baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.658 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fa350db7-fd97-46f3-aa5d-e82d10fda432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:02 np0005466013 podman[236592]: 2025-10-02 12:22:02.665995713 +0000 UTC m=+0.112586591 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:22:02 np0005466013 podman[236589]: 2025-10-02 12:22:02.666472547 +0000 UTC m=+0.112982812 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.691 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7c79e0e4-9d9b-4a2f-840f-c7e3e7b8886b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.712 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1c8662-8e97-48dd-9ef9-95a096cff77f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561951, 'reachable_time': 40813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236658, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.732 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f55160c9-2a58-4fda-9a71-35784c549413]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561963, 'tstamp': 561963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236659, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561966, 'tstamp': 561966}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236659, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.735 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.746 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20eb29be-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.747 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.747 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20eb29be-e0, col_values=(('external_ids', {'iface-id': 'e533861f-45cb-4843-b071-0b628ca25128'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:02.747 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.856 2 DEBUG nova.compute.manager [req-da90f62c-1f24-4e26-b97d-6a7d88ddbdd0 req-312308d9-9e38-4cc5-800f-d49f0b358b34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received event network-vif-unplugged-cae13af9-8175-4eab-b9ec-18019b521d0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.856 2 DEBUG oslo_concurrency.lockutils [req-da90f62c-1f24-4e26-b97d-6a7d88ddbdd0 req-312308d9-9e38-4cc5-800f-d49f0b358b34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.857 2 DEBUG oslo_concurrency.lockutils [req-da90f62c-1f24-4e26-b97d-6a7d88ddbdd0 req-312308d9-9e38-4cc5-800f-d49f0b358b34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.857 2 DEBUG oslo_concurrency.lockutils [req-da90f62c-1f24-4e26-b97d-6a7d88ddbdd0 req-312308d9-9e38-4cc5-800f-d49f0b358b34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.857 2 DEBUG nova.compute.manager [req-da90f62c-1f24-4e26-b97d-6a7d88ddbdd0 req-312308d9-9e38-4cc5-800f-d49f0b358b34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] No waiting events found dispatching network-vif-unplugged-cae13af9-8175-4eab-b9ec-18019b521d0b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:02 np0005466013 nova_compute[192144]: 2025-10-02 12:22:02.857 2 WARNING nova.compute.manager [req-da90f62c-1f24-4e26-b97d-6a7d88ddbdd0 req-312308d9-9e38-4cc5-800f-d49f0b358b34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received unexpected event network-vif-unplugged-cae13af9-8175-4eab-b9ec-18019b521d0b for instance with vm_state active and task_state powering-off.#033[00m
Oct  2 08:22:03 np0005466013 nova_compute[192144]: 2025-10-02 12:22:03.123 2 INFO nova.virt.libvirt.driver [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:22:03 np0005466013 nova_compute[192144]: 2025-10-02 12:22:03.129 2 INFO nova.virt.libvirt.driver [-] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance destroyed successfully.#033[00m
Oct  2 08:22:03 np0005466013 nova_compute[192144]: 2025-10-02 12:22:03.130 2 DEBUG nova.objects.instance [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'numa_topology' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:03 np0005466013 nova_compute[192144]: 2025-10-02 12:22:03.144 2 DEBUG nova.compute.manager [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:03 np0005466013 nova_compute[192144]: 2025-10-02 12:22:03.238 2 DEBUG oslo_concurrency.lockutils [None req-26546478-f033-4694-b8e7-a5fc0e25bdef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:03Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:03:e5 10.100.0.6
Oct  2 08:22:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:03Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:03:e5 10.100.0.6
Oct  2 08:22:03 np0005466013 nova_compute[192144]: 2025-10-02 12:22:03.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:04 np0005466013 nova_compute[192144]: 2025-10-02 12:22:04.974 2 DEBUG nova.compute.manager [req-7ea4a715-c097-4e11-b9e3-617a39c34b0d req-6d2311c8-d18d-4b8b-8219-e62d3e006959 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received event network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:04 np0005466013 nova_compute[192144]: 2025-10-02 12:22:04.974 2 DEBUG oslo_concurrency.lockutils [req-7ea4a715-c097-4e11-b9e3-617a39c34b0d req-6d2311c8-d18d-4b8b-8219-e62d3e006959 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:04 np0005466013 nova_compute[192144]: 2025-10-02 12:22:04.975 2 DEBUG oslo_concurrency.lockutils [req-7ea4a715-c097-4e11-b9e3-617a39c34b0d req-6d2311c8-d18d-4b8b-8219-e62d3e006959 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:04 np0005466013 nova_compute[192144]: 2025-10-02 12:22:04.975 2 DEBUG oslo_concurrency.lockutils [req-7ea4a715-c097-4e11-b9e3-617a39c34b0d req-6d2311c8-d18d-4b8b-8219-e62d3e006959 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:04 np0005466013 nova_compute[192144]: 2025-10-02 12:22:04.975 2 DEBUG nova.compute.manager [req-7ea4a715-c097-4e11-b9e3-617a39c34b0d req-6d2311c8-d18d-4b8b-8219-e62d3e006959 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] No waiting events found dispatching network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:04 np0005466013 nova_compute[192144]: 2025-10-02 12:22:04.975 2 WARNING nova.compute.manager [req-7ea4a715-c097-4e11-b9e3-617a39c34b0d req-6d2311c8-d18d-4b8b-8219-e62d3e006959 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received unexpected event network-vif-plugged-cae13af9-8175-4eab-b9ec-18019b521d0b for instance with vm_state stopped and task_state None.#033[00m
Oct  2 08:22:06 np0005466013 nova_compute[192144]: 2025-10-02 12:22:06.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466013 nova_compute[192144]: 2025-10-02 12:22:08.122 2 DEBUG oslo_concurrency.lockutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:08 np0005466013 nova_compute[192144]: 2025-10-02 12:22:08.125 2 DEBUG oslo_concurrency.lockutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:08 np0005466013 nova_compute[192144]: 2025-10-02 12:22:08.125 2 DEBUG nova.network.neutron [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:22:08 np0005466013 podman[236695]: 2025-10-02 12:22:08.683917997 +0000 UTC m=+0.057111666 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:22:08 np0005466013 podman[236696]: 2025-10-02 12:22:08.698663926 +0000 UTC m=+0.072320950 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:22:08 np0005466013 nova_compute[192144]: 2025-10-02 12:22:08.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:09Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:e3:79 10.100.0.10
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.073 2 DEBUG nova.network.neutron [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.122 2 DEBUG oslo_concurrency.lockutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.268 2 DEBUG nova.virt.libvirt.driver [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.268 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Creating file /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/80ed675472754c51ae6dc429e5d675ac.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.269 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/80ed675472754c51ae6dc429e5d675ac.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.758 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/80ed675472754c51ae6dc429e5d675ac.tmp" returned: 1 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.760 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/80ed675472754c51ae6dc429e5d675ac.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.760 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Creating directory /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.761 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.988 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:10 np0005466013 nova_compute[192144]: 2025-10-02 12:22:10.996 2 INFO nova.virt.libvirt.driver [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance already shutdown.#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.005 2 INFO nova.virt.libvirt.driver [-] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Instance destroyed successfully.#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.007 2 DEBUG nova.virt.libvirt.vif [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629207280',display_name='tempest-ServerActionsTestOtherB-server-1629207280',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629207280',id=108,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+aqSe4de2VLtRAXN5xeLQn4S/3X8QrNMy2M5WdQ5hviVyEOgqK+m+uWmzPaUSUgE38sEdkytfwUHD32CBZajBt4q3OEf9i3yPJUQGuqp42pAUD+A3EoBIyeptNeSxGdA==',key_name='tempest-keypair-1900171990',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-jtzab0yc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=ad2d69bb-3aa9-4c11-b9de-29996574cfa2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-370285634-network", "vif_mac": "fa:16:3e:35:d3:eb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.008 2 DEBUG nova.network.os_vif_util [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-370285634-network", "vif_mac": "fa:16:3e:35:d3:eb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.010 2 DEBUG nova.network.os_vif_util [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.011 2 DEBUG os_vif [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.016 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcae13af9-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.029 2 INFO os_vif [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81')#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.035 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.101 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.102 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.164 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.166 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Copying file /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk to 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.167 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.879 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "scp -r /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk" returned: 0 in 0.712s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.880 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Copying file /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:22:11 np0005466013 nova_compute[192144]: 2025-10-02 12:22:11.881 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk.config 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:12 np0005466013 nova_compute[192144]: 2025-10-02 12:22:12.138 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "scp -C -r /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk.config 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.config" returned: 0 in 0.258s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:12 np0005466013 nova_compute[192144]: 2025-10-02 12:22:12.142 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Copying file /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:22:12 np0005466013 nova_compute[192144]: 2025-10-02 12:22:12.143 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk.info 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:12 np0005466013 nova_compute[192144]: 2025-10-02 12:22:12.361 2 DEBUG oslo_concurrency.processutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "scp -C -r /var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2_resize/disk.info 192.168.122.100:/var/lib/nova/instances/ad2d69bb-3aa9-4c11-b9de-29996574cfa2/disk.info" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:13 np0005466013 nova_compute[192144]: 2025-10-02 12:22:13.303 2 DEBUG neutronclient.v2_0.client [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port cae13af9-8175-4eab-b9ec-18019b521d0b for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:22:13 np0005466013 nova_compute[192144]: 2025-10-02 12:22:13.490 2 DEBUG oslo_concurrency.lockutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:13 np0005466013 nova_compute[192144]: 2025-10-02 12:22:13.490 2 DEBUG oslo_concurrency.lockutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:13 np0005466013 nova_compute[192144]: 2025-10-02 12:22:13.490 2 DEBUG oslo_concurrency.lockutils [None req-47fa1911-b268-4829-9880-033aaa9cbef2 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:13 np0005466013 nova_compute[192144]: 2025-10-02 12:22:13.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.835 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.836 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.836 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.836 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.837 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.870 2 INFO nova.compute.manager [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Terminating instance#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.902 2 DEBUG nova.compute.manager [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:22:14 np0005466013 kernel: tap0c328734-eb (unregistering): left promiscuous mode
Oct  2 08:22:14 np0005466013 NetworkManager[51205]: <info>  [1759407734.9339] device (tap0c328734-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:14Z|00434|binding|INFO|Releasing lport 0c328734-ebc6-47bc-b603-2e4af1cae573 from this chassis (sb_readonly=0)
Oct  2 08:22:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:14Z|00435|binding|INFO|Setting lport 0c328734-ebc6-47bc-b603-2e4af1cae573 down in Southbound
Oct  2 08:22:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:14Z|00436|binding|INFO|Removing iface tap0c328734-eb ovn-installed in OVS
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:14.963 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:e3:79 10.100.0.10'], port_security=['fa:16:3e:ef:e3:79 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f494075-66bf-4ce0-a765-98fd91c31199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0c8c8a8631b4721beed577a99f8bdb7', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'eb030dcc-72ea-4850-916a-e1df7c4d9a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43b5827-85bf-4b83-b921-ec45e12f1f2e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=0c328734-ebc6-47bc-b603-2e4af1cae573) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:14.964 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 0c328734-ebc6-47bc-b603-2e4af1cae573 in datapath 8f494075-66bf-4ce0-a765-98fd91c31199 unbound from our chassis#033[00m
Oct  2 08:22:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:14.965 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f494075-66bf-4ce0-a765-98fd91c31199, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:22:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:14.966 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0114ee7d-063b-41e9-ae95-07e8878dbbb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:14.969 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 namespace which is not needed anymore#033[00m
Oct  2 08:22:14 np0005466013 nova_compute[192144]: 2025-10-02 12:22:14.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:15 np0005466013 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000065.scope: Deactivated successfully.
Oct  2 08:22:15 np0005466013 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000065.scope: Consumed 10.997s CPU time.
Oct  2 08:22:15 np0005466013 systemd-machined[152202]: Machine qemu-51-instance-00000065 terminated.
Oct  2 08:22:15 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236554]: [NOTICE]   (236558) : haproxy version is 2.8.14-c23fe91
Oct  2 08:22:15 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236554]: [NOTICE]   (236558) : path to executable is /usr/sbin/haproxy
Oct  2 08:22:15 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236554]: [WARNING]  (236558) : Exiting Master process...
Oct  2 08:22:15 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236554]: [ALERT]    (236558) : Current worker (236560) exited with code 143 (Terminated)
Oct  2 08:22:15 np0005466013 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[236554]: [WARNING]  (236558) : All workers exited. Exiting... (0)
Oct  2 08:22:15 np0005466013 systemd[1]: libpod-721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948.scope: Deactivated successfully.
Oct  2 08:22:15 np0005466013 podman[236776]: 2025-10-02 12:22:15.16600745 +0000 UTC m=+0.096592964 container died 721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.189 2 INFO nova.virt.libvirt.driver [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Instance destroyed successfully.#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.190 2 DEBUG nova.objects.instance [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'resources' on Instance uuid 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.214 2 DEBUG nova.virt.libvirt.vif [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:19:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1170653470',display_name='tempest-ServersNegativeTestJSON-server-1170653470',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1170653470',id=101,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:46Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0c8c8a8631b4721beed577a99f8bdb7',ramdisk_id='',reservation_id='r-sfsoaqzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-114354241',owner_user_name='tempest-ServersNegativeTestJSON-114354241-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:21:58Z,user_data=None,user_id='a803afe9939346088252c3b944f124f2',uuid=35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.216 2 DEBUG nova.network.os_vif_util [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converting VIF {"id": "0c328734-ebc6-47bc-b603-2e4af1cae573", "address": "fa:16:3e:ef:e3:79", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c328734-eb", "ovs_interfaceid": "0c328734-ebc6-47bc-b603-2e4af1cae573", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.217 2 DEBUG nova.network.os_vif_util [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.217 2 DEBUG os_vif [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c328734-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.228 2 INFO os_vif [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:e3:79,bridge_name='br-int',has_traffic_filtering=True,id=0c328734-ebc6-47bc-b603-2e4af1cae573,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c328734-eb')#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.230 2 INFO nova.virt.libvirt.driver [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Deleting instance files /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c_del#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.236 2 INFO nova.virt.libvirt.driver [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Deletion of /var/lib/nova/instances/35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c_del complete#033[00m
Oct  2 08:22:15 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948-userdata-shm.mount: Deactivated successfully.
Oct  2 08:22:15 np0005466013 systemd[1]: var-lib-containers-storage-overlay-407e3931ab7ca3524164c0910d38b332a6ad9a8978f59a2f7ee5fe5890126d0b-merged.mount: Deactivated successfully.
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.328 2 DEBUG nova.compute.manager [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Received event network-changed-cae13af9-8175-4eab-b9ec-18019b521d0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.329 2 DEBUG nova.compute.manager [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Refreshing instance network info cache due to event network-changed-cae13af9-8175-4eab-b9ec-18019b521d0b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.329 2 DEBUG oslo_concurrency.lockutils [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.329 2 DEBUG oslo_concurrency.lockutils [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.330 2 DEBUG nova.network.neutron [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Refreshing network info cache for port cae13af9-8175-4eab-b9ec-18019b521d0b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.351 2 INFO nova.compute.manager [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.352 2 DEBUG oslo.service.loopingcall [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.352 2 DEBUG nova.compute.manager [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.353 2 DEBUG nova.network.neutron [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:22:15 np0005466013 podman[236776]: 2025-10-02 12:22:15.355260993 +0000 UTC m=+0.285846497 container cleanup 721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:15 np0005466013 systemd[1]: libpod-conmon-721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948.scope: Deactivated successfully.
Oct  2 08:22:15 np0005466013 podman[236822]: 2025-10-02 12:22:15.712818689 +0000 UTC m=+0.320128814 container remove 721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.721 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ad774262-9a04-4734-9ddc-f948aa00f932]: (4, ('Thu Oct  2 12:22:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 (721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948)\n721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948\nThu Oct  2 12:22:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 (721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948)\n721792690a2169788f92c6be41278e2e8c5ae00b7ef3ef2ea19ce4ffbafb6948\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.724 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[06ef70d9-ca29-4f61-bdcf-bfee4bfab5fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.726 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f494075-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:15 np0005466013 kernel: tap8f494075-60: left promiscuous mode
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.744 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d06f930b-5311-42f0-bc9c-58aacd206094]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.769 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[691065da-ab38-42e9-8e7a-3aa9e6e8952c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.772 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[50ccd2b4-237e-4739-9a98-3a3c75308643]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.790 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6088b6-6b25-40f6-bb86-1171d8ce4bec]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570949, 'reachable_time': 29288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236836, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.794 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:22:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:15.794 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[35e09110-862e-4e64-a60b-e56fadbd1975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:15 np0005466013 systemd[1]: run-netns-ovnmeta\x2d8f494075\x2d66bf\x2d4ce0\x2da765\x2d98fd91c31199.mount: Deactivated successfully.
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.878 2 DEBUG nova.compute.manager [req-469a6b95-184a-4dcf-b2d9-2fe7478ed1ec req-52783082-b8d4-401a-a2da-3dfe308c752d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-unplugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.880 2 DEBUG oslo_concurrency.lockutils [req-469a6b95-184a-4dcf-b2d9-2fe7478ed1ec req-52783082-b8d4-401a-a2da-3dfe308c752d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.880 2 DEBUG oslo_concurrency.lockutils [req-469a6b95-184a-4dcf-b2d9-2fe7478ed1ec req-52783082-b8d4-401a-a2da-3dfe308c752d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.880 2 DEBUG oslo_concurrency.lockutils [req-469a6b95-184a-4dcf-b2d9-2fe7478ed1ec req-52783082-b8d4-401a-a2da-3dfe308c752d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.881 2 DEBUG nova.compute.manager [req-469a6b95-184a-4dcf-b2d9-2fe7478ed1ec req-52783082-b8d4-401a-a2da-3dfe308c752d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] No waiting events found dispatching network-vif-unplugged-0c328734-ebc6-47bc-b603-2e4af1cae573 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:15 np0005466013 nova_compute[192144]: 2025-10-02 12:22:15.882 2 DEBUG nova.compute.manager [req-469a6b95-184a-4dcf-b2d9-2fe7478ed1ec req-52783082-b8d4-401a-a2da-3dfe308c752d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-unplugged-0c328734-ebc6-47bc-b603-2e4af1cae573 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.376 2 DEBUG nova.network.neutron [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.422 2 INFO nova.compute.manager [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Took 1.07 seconds to deallocate network for instance.#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.459 2 DEBUG nova.compute.manager [req-0f3ac7ca-c6f9-4488-862a-ab7fb2a2df54 req-37eab980-ebec-4f6f-8d2d-ad36ccaddfd8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-deleted-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.727 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.728 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.820 2 DEBUG nova.compute.provider_tree [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.851 2 DEBUG nova.scheduler.client.report [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.909 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:16 np0005466013 nova_compute[192144]: 2025-10-02 12:22:16.974 2 INFO nova.scheduler.client.report [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Deleted allocations for instance 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.096 2 DEBUG oslo_concurrency.lockutils [None req-50da8073-a72b-4d1e-8c8f-3271b1da9b38 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.586 2 DEBUG nova.network.neutron [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updated VIF entry in instance network info cache for port cae13af9-8175-4eab-b9ec-18019b521d0b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.586 2 DEBUG nova.network.neutron [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.622 2 DEBUG oslo_concurrency.lockutils [req-82b53fa2-c503-45e6-b5b4-8c9fa0be65c2 req-d1da8e55-d34d-422e-a6ad-a7015753f2d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.787 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407722.7864134, ad2d69bb-3aa9-4c11-b9de-29996574cfa2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.788 2 INFO nova.compute.manager [-] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.936 2 DEBUG nova.compute.manager [None req-35197bd9-ae5e-49f7-8f2f-91fe37ebb119 - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.939 2 DEBUG nova.compute.manager [None req-35197bd9-ae5e-49f7-8f2f-91fe37ebb119 - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: resize_migrated, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.954 2 DEBUG nova.compute.manager [req-3a95085f-1c0c-488c-b849-0590b5d182b6 req-1511de0f-66c6-4efd-89e4-8a3c18e6e5c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.954 2 DEBUG oslo_concurrency.lockutils [req-3a95085f-1c0c-488c-b849-0590b5d182b6 req-1511de0f-66c6-4efd-89e4-8a3c18e6e5c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.954 2 DEBUG oslo_concurrency.lockutils [req-3a95085f-1c0c-488c-b849-0590b5d182b6 req-1511de0f-66c6-4efd-89e4-8a3c18e6e5c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.955 2 DEBUG oslo_concurrency.lockutils [req-3a95085f-1c0c-488c-b849-0590b5d182b6 req-1511de0f-66c6-4efd-89e4-8a3c18e6e5c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.955 2 DEBUG nova.compute.manager [req-3a95085f-1c0c-488c-b849-0590b5d182b6 req-1511de0f-66c6-4efd-89e4-8a3c18e6e5c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] No waiting events found dispatching network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.955 2 WARNING nova.compute.manager [req-3a95085f-1c0c-488c-b849-0590b5d182b6 req-1511de0f-66c6-4efd-89e4-8a3c18e6e5c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Received unexpected event network-vif-plugged-0c328734-ebc6-47bc-b603-2e4af1cae573 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:22:17 np0005466013 nova_compute[192144]: 2025-10-02 12:22:17.979 2 INFO nova.compute.manager [None req-35197bd9-ae5e-49f7-8f2f-91fe37ebb119 - - - - - -] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Oct  2 08:22:18 np0005466013 nova_compute[192144]: 2025-10-02 12:22:18.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:20 np0005466013 nova_compute[192144]: 2025-10-02 12:22:20.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:21 np0005466013 nova_compute[192144]: 2025-10-02 12:22:21.791 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:21 np0005466013 nova_compute[192144]: 2025-10-02 12:22:21.792 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:21 np0005466013 nova_compute[192144]: 2025-10-02 12:22:21.793 2 DEBUG nova.compute.manager [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Going to confirm migration 15 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:22:21 np0005466013 nova_compute[192144]: 2025-10-02 12:22:21.828 2 DEBUG nova.objects.instance [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'info_cache' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:22 np0005466013 nova_compute[192144]: 2025-10-02 12:22:22.248 2 DEBUG neutronclient.v2_0.client [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port cae13af9-8175-4eab-b9ec-18019b521d0b for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:22:22 np0005466013 nova_compute[192144]: 2025-10-02 12:22:22.249 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:22 np0005466013 nova_compute[192144]: 2025-10-02 12:22:22.249 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquired lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:22 np0005466013 nova_compute[192144]: 2025-10-02 12:22:22.249 2 DEBUG nova.network.neutron [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:22:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:22Z|00437|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:22:22 np0005466013 nova_compute[192144]: 2025-10-02 12:22:22.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:23 np0005466013 podman[236842]: 2025-10-02 12:22:23.689335561 +0000 UTC m=+0.057058235 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:22:23 np0005466013 podman[236840]: 2025-10-02 12:22:23.689812945 +0000 UTC m=+0.058260032 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:22:23 np0005466013 podman[236843]: 2025-10-02 12:22:23.762860926 +0000 UTC m=+0.128391162 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.792 2 DEBUG nova.network.neutron [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Updating instance_info_cache with network_info: [{"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.828 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Releasing lock "refresh_cache-ad2d69bb-3aa9-4c11-b9de-29996574cfa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.828 2 DEBUG nova.objects.instance [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'migration_context' on Instance uuid ad2d69bb-3aa9-4c11-b9de-29996574cfa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.856 2 DEBUG nova.virt.libvirt.vif [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629207280',display_name='tempest-ServerActionsTestOtherB-server-1629207280',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629207280',id=108,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+aqSe4de2VLtRAXN5xeLQn4S/3X8QrNMy2M5WdQ5hviVyEOgqK+m+uWmzPaUSUgE38sEdkytfwUHD32CBZajBt4q3OEf9i3yPJUQGuqp42pAUD+A3EoBIyeptNeSxGdA==',key_name='tempest-keypair-1900171990',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:22:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-jtzab0yc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=ad2d69bb-3aa9-4c11-b9de-29996574cfa2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.857 2 DEBUG nova.network.os_vif_util [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "cae13af9-8175-4eab-b9ec-18019b521d0b", "address": "fa:16:3e:35:d3:eb", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae13af9-81", "ovs_interfaceid": "cae13af9-8175-4eab-b9ec-18019b521d0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.858 2 DEBUG nova.network.os_vif_util [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.858 2 DEBUG os_vif [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcae13af9-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.862 2 INFO os_vif [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:d3:eb,bridge_name='br-int',has_traffic_filtering=True,id=cae13af9-8175-4eab-b9ec-18019b521d0b,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcae13af9-81')#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.863 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.863 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.947 2 DEBUG nova.compute.provider_tree [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:23 np0005466013 nova_compute[192144]: 2025-10-02 12:22:23.969 2 DEBUG nova.scheduler.client.report [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:24 np0005466013 nova_compute[192144]: 2025-10-02 12:22:24.012 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:24 np0005466013 nova_compute[192144]: 2025-10-02 12:22:24.013 2 DEBUG nova.compute.manager [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ad2d69bb-3aa9-4c11-b9de-29996574cfa2] Resized/migrated instance is powered off. Setting vm_state to 'stopped'. _confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4805#033[00m
Oct  2 08:22:24 np0005466013 nova_compute[192144]: 2025-10-02 12:22:24.131 2 INFO nova.scheduler.client.report [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Deleted allocation for migration 630aba62-6e03-48d5-8063-37553a2a143a#033[00m
Oct  2 08:22:24 np0005466013 nova_compute[192144]: 2025-10-02 12:22:24.217 2 DEBUG oslo_concurrency.lockutils [None req-0711420b-4bae-4d0b-8765-319312f6e860 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ad2d69bb-3aa9-4c11-b9de-29996574cfa2" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:25 np0005466013 nova_compute[192144]: 2025-10-02 12:22:25.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:26 np0005466013 nova_compute[192144]: 2025-10-02 12:22:26.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:26 np0005466013 nova_compute[192144]: 2025-10-02 12:22:26.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:22:26 np0005466013 nova_compute[192144]: 2025-10-02 12:22:26.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.013 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.014 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.014 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.014 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.096 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.157 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.158 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.211 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.343 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.344 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5537MB free_disk=73.25717544555664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.344 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.345 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.409 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.409 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.410 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.448 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.468 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.492 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:22:27 np0005466013 nova_compute[192144]: 2025-10-02 12:22:27.493 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:28 np0005466013 nova_compute[192144]: 2025-10-02 12:22:28.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.186 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407735.185233, 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.187 2 INFO nova.compute.manager [-] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.216 2 DEBUG nova.compute.manager [None req-c7897ffa-4362-4b67-9e38-de744e1d0a2e - - - - - -] [instance: 35c6bb03-2e70-4705-bfc3-78bdeeaf6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.494 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.494 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466013 nova_compute[192144]: 2025-10-02 12:22:30.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:32 np0005466013 nova_compute[192144]: 2025-10-02 12:22:32.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:32 np0005466013 nova_compute[192144]: 2025-10-02 12:22:32.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:33 np0005466013 podman[236919]: 2025-10-02 12:22:33.67769589 +0000 UTC m=+0.058285104 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:22:33 np0005466013 podman[236920]: 2025-10-02 12:22:33.679408534 +0000 UTC m=+0.058238523 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6)
Oct  2 08:22:33 np0005466013 podman[236921]: 2025-10-02 12:22:33.690309073 +0000 UTC m=+0.064999432 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:22:33 np0005466013 nova_compute[192144]: 2025-10-02 12:22:33.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:33 np0005466013 nova_compute[192144]: 2025-10-02 12:22:33.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:33 np0005466013 nova_compute[192144]: 2025-10-02 12:22:33.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:33 np0005466013 nova_compute[192144]: 2025-10-02 12:22:33.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:22:33 np0005466013 nova_compute[192144]: 2025-10-02 12:22:33.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:22:34 np0005466013 nova_compute[192144]: 2025-10-02 12:22:34.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466013 nova_compute[192144]: 2025-10-02 12:22:34.298 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:34 np0005466013 nova_compute[192144]: 2025-10-02 12:22:34.298 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:34 np0005466013 nova_compute[192144]: 2025-10-02 12:22:34.299 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:22:34 np0005466013 nova_compute[192144]: 2025-10-02 12:22:34.299 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:35 np0005466013 nova_compute[192144]: 2025-10-02 12:22:35.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:36 np0005466013 nova_compute[192144]: 2025-10-02 12:22:36.850 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updating instance_info_cache with network_info: [{"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:36 np0005466013 nova_compute[192144]: 2025-10-02 12:22:36.863 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:36 np0005466013 nova_compute[192144]: 2025-10-02 12:22:36.863 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:22:38 np0005466013 nova_compute[192144]: 2025-10-02 12:22:38.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:39 np0005466013 podman[236982]: 2025-10-02 12:22:39.691861606 +0000 UTC m=+0.059627885 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  2 08:22:39 np0005466013 podman[236981]: 2025-10-02 12:22:39.699746921 +0000 UTC m=+0.065614341 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:22:40 np0005466013 nova_compute[192144]: 2025-10-02 12:22:40.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005466013 nova_compute[192144]: 2025-10-02 12:22:40.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005466013 nova_compute[192144]: 2025-10-02 12:22:40.859 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:43 np0005466013 nova_compute[192144]: 2025-10-02 12:22:43.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:44 np0005466013 nova_compute[192144]: 2025-10-02 12:22:44.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:45 np0005466013 nova_compute[192144]: 2025-10-02 12:22:45.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:47Z|00438|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:22:47 np0005466013 nova_compute[192144]: 2025-10-02 12:22:47.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:47Z|00439|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:22:47 np0005466013 nova_compute[192144]: 2025-10-02 12:22:47.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.692 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.693 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.713 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:48 np0005466013 NetworkManager[51205]: <info>  [1759407768.7441] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Oct  2 08:22:48 np0005466013 NetworkManager[51205]: <info>  [1759407768.7462] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.803 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.805 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.812 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.812 2 INFO nova.compute.claims [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:48 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:48Z|00440|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.968 2 DEBUG nova.compute.provider_tree [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.982 2 DEBUG nova.scheduler.client.report [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:48 np0005466013 nova_compute[192144]: 2025-10-02 12:22:48.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.004 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.005 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.063 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.063 2 DEBUG nova.network.neutron [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.088 2 INFO nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.109 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.245 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.248 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.249 2 INFO nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Creating image(s)#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.250 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "/var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.250 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "/var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.251 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "/var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.264 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.339 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.340 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.341 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.353 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.418 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.419 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.528 2 DEBUG nova.policy [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34f60e2f6dd64fc8a66fda781f291109', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e50da64635e042abbcac5618a1476e01', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.689 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk 1073741824" returned: 0 in 0.270s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.690 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.691 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.774 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.776 2 DEBUG nova.virt.disk.api [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Checking if we can resize image /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.777 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.856 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.858 2 DEBUG nova.virt.disk.api [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Cannot resize image /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.859 2 DEBUG nova.objects.instance [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lazy-loading 'migration_context' on Instance uuid 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.881 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.882 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Ensure instance console log exists: /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.882 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.883 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:49 np0005466013 nova_compute[192144]: 2025-10-02 12:22:49.883 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:50 np0005466013 nova_compute[192144]: 2025-10-02 12:22:50.043 2 DEBUG nova.network.neutron [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Successfully created port: 7a345361-b752-4cc0-97a4-a1d5bfb267a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:22:50 np0005466013 nova_compute[192144]: 2025-10-02 12:22:50.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.428 2 DEBUG nova.network.neutron [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Successfully updated port: 7a345361-b752-4cc0-97a4-a1d5bfb267a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.460 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "refresh_cache-7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.460 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquired lock "refresh_cache-7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.461 2 DEBUG nova.network.neutron [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.527 2 DEBUG nova.compute.manager [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received event network-changed-7a345361-b752-4cc0-97a4-a1d5bfb267a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.527 2 DEBUG nova.compute.manager [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Refreshing instance network info cache due to event network-changed-7a345361-b752-4cc0-97a4-a1d5bfb267a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.528 2 DEBUG oslo_concurrency.lockutils [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:51 np0005466013 nova_compute[192144]: 2025-10-02 12:22:51.643 2 DEBUG nova.network.neutron [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.500 2 DEBUG nova.network.neutron [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Updating instance_info_cache with network_info: [{"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.889 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Releasing lock "refresh_cache-7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.890 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Instance network_info: |[{"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.891 2 DEBUG oslo_concurrency.lockutils [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.891 2 DEBUG nova.network.neutron [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Refreshing network info cache for port 7a345361-b752-4cc0-97a4-a1d5bfb267a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.896 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Start _get_guest_xml network_info=[{"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.902 2 WARNING nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.945 2 DEBUG nova.virt.libvirt.host [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.947 2 DEBUG nova.virt.libvirt.host [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.955 2 DEBUG nova.virt.libvirt.host [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.956 2 DEBUG nova.virt.libvirt.host [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.958 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.959 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.959 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.960 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.960 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.960 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.961 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.961 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.962 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.962 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.962 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.963 2 DEBUG nova.virt.hardware [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.969 2 DEBUG nova.virt.libvirt.vif [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1369592646',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1369592646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1369592646',id=115,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e50da64635e042abbcac5618a1476e01',ramdisk_id='',reservation_id='r-hy7hzmnw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1092797849',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1092797849-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:49Z,user_data=None,user_id='34f60e2f6dd64fc8a66fda781f291109',uuid=7c3bfdbf-eed9-420a-b5c8-cfb648dceac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.969 2 DEBUG nova.network.os_vif_util [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Converting VIF {"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.971 2 DEBUG nova.network.os_vif_util [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:45:41,bridge_name='br-int',has_traffic_filtering=True,id=7a345361-b752-4cc0-97a4-a1d5bfb267a3,network=Network(a6507c1e-923c-46c5-b6ef-82a78a7fe9f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a345361-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:52 np0005466013 nova_compute[192144]: 2025-10-02 12:22:52.972 2 DEBUG nova.objects.instance [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.216 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <uuid>7c3bfdbf-eed9-420a-b5c8-cfb648dceac1</uuid>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <name>instance-00000073</name>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-1369592646</nova:name>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:22:52</nova:creationTime>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:user uuid="34f60e2f6dd64fc8a66fda781f291109">tempest-ServersNegativeTestMultiTenantJSON-1092797849-project-member</nova:user>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:project uuid="e50da64635e042abbcac5618a1476e01">tempest-ServersNegativeTestMultiTenantJSON-1092797849</nova:project>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        <nova:port uuid="7a345361-b752-4cc0-97a4-a1d5bfb267a3">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <entry name="serial">7c3bfdbf-eed9-420a-b5c8-cfb648dceac1</entry>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <entry name="uuid">7c3bfdbf-eed9-420a-b5c8-cfb648dceac1</entry>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk.config"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:ff:45:41"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <target dev="tap7a345361-b7"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/console.log" append="off"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:22:53 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:22:53 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:22:53 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:22:53 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.218 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Preparing to wait for external event network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.218 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.219 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.220 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.222 2 DEBUG nova.virt.libvirt.vif [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1369592646',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1369592646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1369592646',id=115,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e50da64635e042abbcac5618a1476e01',ramdisk_id='',reservation_id='r-hy7hzmnw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1092797849',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1092797849-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:49Z,user_data=None,user_id='34f60e2f6dd64fc8a66fda781f291109',uuid=7c3bfdbf-eed9-420a-b5c8-cfb648dceac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.223 2 DEBUG nova.network.os_vif_util [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Converting VIF {"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.225 2 DEBUG nova.network.os_vif_util [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:45:41,bridge_name='br-int',has_traffic_filtering=True,id=7a345361-b752-4cc0-97a4-a1d5bfb267a3,network=Network(a6507c1e-923c-46c5-b6ef-82a78a7fe9f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a345361-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.226 2 DEBUG os_vif [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:45:41,bridge_name='br-int',has_traffic_filtering=True,id=7a345361-b752-4cc0-97a4-a1d5bfb267a3,network=Network(a6507c1e-923c-46c5-b6ef-82a78a7fe9f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a345361-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.234 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a345361-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a345361-b7, col_values=(('external_ids', {'iface-id': '7a345361-b752-4cc0-97a4-a1d5bfb267a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:45:41', 'vm-uuid': '7c3bfdbf-eed9-420a-b5c8-cfb648dceac1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:53 np0005466013 NetworkManager[51205]: <info>  [1759407773.2387] manager: (tap7a345361-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.246 2 INFO os_vif [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:45:41,bridge_name='br-int',has_traffic_filtering=True,id=7a345361-b752-4cc0-97a4-a1d5bfb267a3,network=Network(a6507c1e-923c-46c5-b6ef-82a78a7fe9f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a345361-b7')#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.757 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.757 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.758 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] No VIF found with MAC fa:16:3e:ff:45:41, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.759 2 INFO nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Using config drive#033[00m
Oct  2 08:22:53 np0005466013 nova_compute[192144]: 2025-10-02 12:22:53.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:54 np0005466013 podman[237041]: 2025-10-02 12:22:54.198207943 +0000 UTC m=+0.070777382 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:22:54 np0005466013 podman[237042]: 2025-10-02 12:22:54.201409503 +0000 UTC m=+0.067406327 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:22:54 np0005466013 podman[237043]: 2025-10-02 12:22:54.239372153 +0000 UTC m=+0.100560257 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:22:54 np0005466013 nova_compute[192144]: 2025-10-02 12:22:54.494 2 INFO nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Creating config drive at /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk.config#033[00m
Oct  2 08:22:54 np0005466013 nova_compute[192144]: 2025-10-02 12:22:54.500 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkfrw4ya execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:54 np0005466013 nova_compute[192144]: 2025-10-02 12:22:54.630 2 DEBUG oslo_concurrency.processutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkfrw4ya" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:54 np0005466013 kernel: tap7a345361-b7: entered promiscuous mode
Oct  2 08:22:54 np0005466013 NetworkManager[51205]: <info>  [1759407774.7166] manager: (tap7a345361-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Oct  2 08:22:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:54Z|00441|binding|INFO|Claiming lport 7a345361-b752-4cc0-97a4-a1d5bfb267a3 for this chassis.
Oct  2 08:22:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:54Z|00442|binding|INFO|7a345361-b752-4cc0-97a4-a1d5bfb267a3: Claiming fa:16:3e:ff:45:41 10.100.0.5
Oct  2 08:22:54 np0005466013 nova_compute[192144]: 2025-10-02 12:22:54.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:54Z|00443|binding|INFO|Setting lport 7a345361-b752-4cc0-97a4-a1d5bfb267a3 ovn-installed in OVS
Oct  2 08:22:54 np0005466013 nova_compute[192144]: 2025-10-02 12:22:54.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:54 np0005466013 nova_compute[192144]: 2025-10-02 12:22:54.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:54 np0005466013 systemd-udevd[237122]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:22:54 np0005466013 systemd-machined[152202]: New machine qemu-52-instance-00000073.
Oct  2 08:22:54 np0005466013 NetworkManager[51205]: <info>  [1759407774.7757] device (tap7a345361-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:22:54 np0005466013 NetworkManager[51205]: <info>  [1759407774.7768] device (tap7a345361-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:22:54 np0005466013 systemd[1]: Started Virtual Machine qemu-52-instance-00000073.
Oct  2 08:22:54 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:54Z|00444|binding|INFO|Setting lport 7a345361-b752-4cc0-97a4-a1d5bfb267a3 up in Southbound
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.793 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:45:41 10.100.0.5'], port_security=['fa:16:3e:ff:45:41 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7c3bfdbf-eed9-420a-b5c8-cfb648dceac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e50da64635e042abbcac5618a1476e01', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a1bba91-2572-401b-810b-959ce43362d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5e479f5-b5ec-47b5-af33-9ce91f260a2d, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=7a345361-b752-4cc0-97a4-a1d5bfb267a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.795 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 7a345361-b752-4cc0-97a4-a1d5bfb267a3 in datapath a6507c1e-923c-46c5-b6ef-82a78a7fe9f6 bound to our chassis#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.797 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6507c1e-923c-46c5-b6ef-82a78a7fe9f6#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.811 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[56f33d23-8680-436d-8815-151169238754]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.813 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa6507c1e-91 in ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.816 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa6507c1e-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.816 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0831c795-9b48-4969-bc98-258f516c2b93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.817 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[24a55ced-db2c-45c4-9811-0900858e5672]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.830 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[5787e71f-d766-4b7c-b82d-92f71025f151]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.851 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7871754a-8d03-4281-a187-30041f3d67f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.886 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[95e6227c-ccba-461f-8d9b-b5a136081009]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 systemd-udevd[237125]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:22:54 np0005466013 NetworkManager[51205]: <info>  [1759407774.8935] manager: (tapa6507c1e-90): new Veth device (/org/freedesktop/NetworkManager/Devices/207)
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.895 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[360125fc-4175-4c3a-9dab-99755bab4e95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.940 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b7fea84b-4a64-48ce-a1a8-509d6a39b738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.943 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0f098306-ad15-4a83-bb8b-95ecd6e22432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:54 np0005466013 NetworkManager[51205]: <info>  [1759407774.9735] device (tapa6507c1e-90): carrier: link connected
Oct  2 08:22:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:54.985 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[46ffc7d3-3df3-46eb-99c8-adabc7cc0289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.012 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[820d5e82-0acb-49a2-ab29-859187dc0ed4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6507c1e-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:2a:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576659, 'reachable_time': 28888, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237156, 'error': None, 'target': 'ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.039 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8c8f0978-8cf9-44aa-af8a-c5261e16dff1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:2a07'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 576659, 'tstamp': 576659}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237157, 'error': None, 'target': 'ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.067 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3c10c4-2bba-4de7-bab4-d87bdb202c20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6507c1e-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:2a:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576659, 'reachable_time': 28888, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237158, 'error': None, 'target': 'ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.124 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5c222c-9241-47d8-adba-95141935a152]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.211 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[980edd1f-dcd4-450b-902b-b9ade1d5447b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.214 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6507c1e-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.214 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.215 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6507c1e-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:55 np0005466013 kernel: tapa6507c1e-90: entered promiscuous mode
Oct  2 08:22:55 np0005466013 NetworkManager[51205]: <info>  [1759407775.2216] manager: (tapa6507c1e-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.225 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6507c1e-90, col_values=(('external_ids', {'iface-id': '49d05c77-d214-4768-bf45-47005dc24d24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:55 np0005466013 ovn_controller[94366]: 2025-10-02T12:22:55Z|00445|binding|INFO|Releasing lport 49d05c77-d214-4768-bf45-47005dc24d24 from this chassis (sb_readonly=0)
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.230 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a6507c1e-923c-46c5-b6ef-82a78a7fe9f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a6507c1e-923c-46c5-b6ef-82a78a7fe9f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.231 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc7b88c-60b8-4da7-b559-1f9c678810df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.232 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/a6507c1e-923c-46c5-b6ef-82a78a7fe9f6.pid.haproxy
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID a6507c1e-923c-46c5-b6ef-82a78a7fe9f6
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:22:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:22:55.232 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'env', 'PROCESS_TAG=haproxy-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a6507c1e-923c-46c5-b6ef-82a78a7fe9f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.388 2 DEBUG nova.network.neutron [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Updated VIF entry in instance network info cache for port 7a345361-b752-4cc0-97a4-a1d5bfb267a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.389 2 DEBUG nova.network.neutron [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Updating instance_info_cache with network_info: [{"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.539 2 DEBUG oslo_concurrency.lockutils [req-e74bd58e-8541-49ab-b1ea-6e9047ca89b3 req-9689ec5d-8564-4598-9c64-97cc4880640c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.758 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407775.758105, 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.760 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:22:55 np0005466013 podman[237194]: 2025-10-02 12:22:55.688119807 +0000 UTC m=+0.039411686 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.851 2 DEBUG nova.compute.manager [req-fe43f22d-9774-4b68-b6c3-452d6ad36f01 req-3505fa5f-aa0f-40e3-8432-d1d6a83640a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received event network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.853 2 DEBUG oslo_concurrency.lockutils [req-fe43f22d-9774-4b68-b6c3-452d6ad36f01 req-3505fa5f-aa0f-40e3-8432-d1d6a83640a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.854 2 DEBUG oslo_concurrency.lockutils [req-fe43f22d-9774-4b68-b6c3-452d6ad36f01 req-3505fa5f-aa0f-40e3-8432-d1d6a83640a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.854 2 DEBUG oslo_concurrency.lockutils [req-fe43f22d-9774-4b68-b6c3-452d6ad36f01 req-3505fa5f-aa0f-40e3-8432-d1d6a83640a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.855 2 DEBUG nova.compute.manager [req-fe43f22d-9774-4b68-b6c3-452d6ad36f01 req-3505fa5f-aa0f-40e3-8432-d1d6a83640a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Processing event network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.857 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.863 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.871 2 INFO nova.virt.libvirt.driver [-] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Instance spawned successfully.#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.873 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.877 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:55 np0005466013 nova_compute[192144]: 2025-10-02 12:22:55.882 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.114 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.115 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407775.7591288, 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.115 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.123 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.124 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.124 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.125 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.125 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.125 2 DEBUG nova.virt.libvirt.driver [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.244 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.249 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407775.860887, 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.249 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:22:56 np0005466013 podman[237194]: 2025-10-02 12:22:56.260971719 +0000 UTC m=+0.612263558 container create f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:22:56 np0005466013 systemd[1]: Started libpod-conmon-f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f.scope.
Oct  2 08:22:56 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:22:56 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e77d5323be808504b89ea80ccc8dbbd518617e6dd4c964598bd6ffbe5731123/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.536 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:56 np0005466013 podman[237194]: 2025-10-02 12:22:56.548188199 +0000 UTC m=+0.899480038 container init f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.553 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:22:56 np0005466013 podman[237194]: 2025-10-02 12:22:56.555801226 +0000 UTC m=+0.907093065 container start f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:22:56 np0005466013 neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6[237209]: [NOTICE]   (237213) : New worker (237215) forked
Oct  2 08:22:56 np0005466013 neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6[237209]: [NOTICE]   (237213) : Loading success.
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.772 2 INFO nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Took 7.53 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:22:56 np0005466013 nova_compute[192144]: 2025-10-02 12:22:56.773 2 DEBUG nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:57 np0005466013 nova_compute[192144]: 2025-10-02 12:22:57.025 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.225 2 INFO nova.compute.manager [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Took 9.46 seconds to build instance.#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.460 2 DEBUG oslo_concurrency.lockutils [None req-f514f3ba-82fb-42ef-ade3-d176c99b2140 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.993 2 DEBUG nova.compute.manager [req-65055813-999a-4185-b0a9-32a0fa822635 req-da4aa313-e572-49ab-917c-d53ba35af7c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received event network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.994 2 DEBUG oslo_concurrency.lockutils [req-65055813-999a-4185-b0a9-32a0fa822635 req-da4aa313-e572-49ab-917c-d53ba35af7c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.994 2 DEBUG oslo_concurrency.lockutils [req-65055813-999a-4185-b0a9-32a0fa822635 req-da4aa313-e572-49ab-917c-d53ba35af7c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.995 2 DEBUG oslo_concurrency.lockutils [req-65055813-999a-4185-b0a9-32a0fa822635 req-da4aa313-e572-49ab-917c-d53ba35af7c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.995 2 DEBUG nova.compute.manager [req-65055813-999a-4185-b0a9-32a0fa822635 req-da4aa313-e572-49ab-917c-d53ba35af7c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] No waiting events found dispatching network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:58 np0005466013 nova_compute[192144]: 2025-10-02 12:22:58.995 2 WARNING nova.compute.manager [req-65055813-999a-4185-b0a9-32a0fa822635 req-da4aa313-e572-49ab-917c-d53ba35af7c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received unexpected event network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:23:01 np0005466013 nova_compute[192144]: 2025-10-02 12:23:01.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:01.176 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:01.178 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:23:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:02.305 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:02.306 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:02.307 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:03 np0005466013 nova_compute[192144]: 2025-10-02 12:23:03.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:03 np0005466013 nova_compute[192144]: 2025-10-02 12:23:03.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:04 np0005466013 podman[237225]: 2025-10-02 12:23:04.724146287 +0000 UTC m=+0.089412260 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6)
Oct  2 08:23:04 np0005466013 podman[237224]: 2025-10-02 12:23:04.739851066 +0000 UTC m=+0.101206778 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 08:23:04 np0005466013 podman[237226]: 2025-10-02 12:23:04.744585723 +0000 UTC m=+0.097988957 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:23:05 np0005466013 nova_compute[192144]: 2025-10-02 12:23:05.445 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:05 np0005466013 nova_compute[192144]: 2025-10-02 12:23:05.447 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:05 np0005466013 nova_compute[192144]: 2025-10-02 12:23:05.447 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:05 np0005466013 nova_compute[192144]: 2025-10-02 12:23:05.448 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:05 np0005466013 nova_compute[192144]: 2025-10-02 12:23:05.448 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:05 np0005466013 nova_compute[192144]: 2025-10-02 12:23:05.956 2 INFO nova.compute.manager [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Terminating instance#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.190 2 DEBUG nova.compute.manager [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:23:06 np0005466013 kernel: tap7a345361-b7 (unregistering): left promiscuous mode
Oct  2 08:23:06 np0005466013 NetworkManager[51205]: <info>  [1759407786.2195] device (tap7a345361-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:06Z|00446|binding|INFO|Releasing lport 7a345361-b752-4cc0-97a4-a1d5bfb267a3 from this chassis (sb_readonly=0)
Oct  2 08:23:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:06Z|00447|binding|INFO|Setting lport 7a345361-b752-4cc0-97a4-a1d5bfb267a3 down in Southbound
Oct  2 08:23:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:06Z|00448|binding|INFO|Removing iface tap7a345361-b7 ovn-installed in OVS
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:06 np0005466013 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000073.scope: Deactivated successfully.
Oct  2 08:23:06 np0005466013 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000073.scope: Consumed 11.273s CPU time.
Oct  2 08:23:06 np0005466013 systemd-machined[152202]: Machine qemu-52-instance-00000073 terminated.
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.337 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:45:41 10.100.0.5'], port_security=['fa:16:3e:ff:45:41 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7c3bfdbf-eed9-420a-b5c8-cfb648dceac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e50da64635e042abbcac5618a1476e01', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a1bba91-2572-401b-810b-959ce43362d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5e479f5-b5ec-47b5-af33-9ce91f260a2d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=7a345361-b752-4cc0-97a4-a1d5bfb267a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.338 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 7a345361-b752-4cc0-97a4-a1d5bfb267a3 in datapath a6507c1e-923c-46c5-b6ef-82a78a7fe9f6 unbound from our chassis#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.340 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6507c1e-923c-46c5-b6ef-82a78a7fe9f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.343 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1e907bd5-9f03-4926-a2b1-8108adcc5ba9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.344 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6 namespace which is not needed anymore#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.467 2 INFO nova.virt.libvirt.driver [-] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Instance destroyed successfully.#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.468 2 DEBUG nova.objects.instance [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lazy-loading 'resources' on Instance uuid 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:06 np0005466013 neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6[237209]: [NOTICE]   (237213) : haproxy version is 2.8.14-c23fe91
Oct  2 08:23:06 np0005466013 neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6[237209]: [NOTICE]   (237213) : path to executable is /usr/sbin/haproxy
Oct  2 08:23:06 np0005466013 neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6[237209]: [WARNING]  (237213) : Exiting Master process...
Oct  2 08:23:06 np0005466013 neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6[237209]: [ALERT]    (237213) : Current worker (237215) exited with code 143 (Terminated)
Oct  2 08:23:06 np0005466013 neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6[237209]: [WARNING]  (237213) : All workers exited. Exiting... (0)
Oct  2 08:23:06 np0005466013 systemd[1]: libpod-f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f.scope: Deactivated successfully.
Oct  2 08:23:06 np0005466013 podman[237312]: 2025-10-02 12:23:06.515311579 +0000 UTC m=+0.059079067 container died f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:23:06 np0005466013 systemd[1]: var-lib-containers-storage-overlay-5e77d5323be808504b89ea80ccc8dbbd518617e6dd4c964598bd6ffbe5731123-merged.mount: Deactivated successfully.
Oct  2 08:23:06 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f-userdata-shm.mount: Deactivated successfully.
Oct  2 08:23:06 np0005466013 podman[237312]: 2025-10-02 12:23:06.577702239 +0000 UTC m=+0.121469737 container cleanup f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:23:06 np0005466013 systemd[1]: libpod-conmon-f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f.scope: Deactivated successfully.
Oct  2 08:23:06 np0005466013 podman[237350]: 2025-10-02 12:23:06.682613541 +0000 UTC m=+0.077114838 container remove f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.689 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1e33d7f6-be1d-469a-9939-395d9c2a4c61]: (4, ('Thu Oct  2 12:23:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6 (f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f)\nf75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f\nThu Oct  2 12:23:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6 (f75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f)\nf75eadf271921d562c06dd71ff624d9f937cd6255e5928c80cb4c3dd48044e5f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.692 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4830f1-9e56-4e53-8a60-21d7d22a9930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.693 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6507c1e-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.693 2 DEBUG nova.virt.libvirt.vif [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1369592646',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1369592646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1369592646',id=115,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:22:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e50da64635e042abbcac5618a1476e01',ramdisk_id='',reservation_id='r-hy7hzmnw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1092797849',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1092797849-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:57Z,user_data=None,user_id='34f60e2f6dd64fc8a66fda781f291109',uuid=7c3bfdbf-eed9-420a-b5c8-cfb648dceac1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.694 2 DEBUG nova.network.os_vif_util [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Converting VIF {"id": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "address": "fa:16:3e:ff:45:41", "network": {"id": "a6507c1e-923c-46c5-b6ef-82a78a7fe9f6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-545963585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e50da64635e042abbcac5618a1476e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a345361-b7", "ovs_interfaceid": "7a345361-b752-4cc0-97a4-a1d5bfb267a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.695 2 DEBUG nova.network.os_vif_util [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:45:41,bridge_name='br-int',has_traffic_filtering=True,id=7a345361-b752-4cc0-97a4-a1d5bfb267a3,network=Network(a6507c1e-923c-46c5-b6ef-82a78a7fe9f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a345361-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.696 2 DEBUG os_vif [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:45:41,bridge_name='br-int',has_traffic_filtering=True,id=7a345361-b752-4cc0-97a4-a1d5bfb267a3,network=Network(a6507c1e-923c-46c5-b6ef-82a78a7fe9f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a345361-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:23:06 np0005466013 kernel: tapa6507c1e-90: left promiscuous mode
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.700 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a345361-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.718 2 INFO os_vif [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:45:41,bridge_name='br-int',has_traffic_filtering=True,id=7a345361-b752-4cc0-97a4-a1d5bfb267a3,network=Network(a6507c1e-923c-46c5-b6ef-82a78a7fe9f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a345361-b7')#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.718 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[39c25640-4c04-4220-bda1-2c2530ed1f9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.719 2 INFO nova.virt.libvirt.driver [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Deleting instance files /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1_del#033[00m
Oct  2 08:23:06 np0005466013 nova_compute[192144]: 2025-10-02 12:23:06.720 2 INFO nova.virt.libvirt.driver [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Deletion of /var/lib/nova/instances/7c3bfdbf-eed9-420a-b5c8-cfb648dceac1_del complete#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.750 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9001d621-0083-4a1c-8d2d-8f943736a32f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.752 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[adc97bc3-7a90-4a03-8d70-cff163d37268]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.775 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fffe1450-05f2-42f4-8130-12ebe65738ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576649, 'reachable_time': 31112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237368, 'error': None, 'target': 'ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.779 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a6507c1e-923c-46c5-b6ef-82a78a7fe9f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:23:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:06.779 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[eaea227f-079b-4a5c-8d41-4923790e85b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:06 np0005466013 systemd[1]: run-netns-ovnmeta\x2da6507c1e\x2d923c\x2d46c5\x2db6ef\x2d82a78a7fe9f6.mount: Deactivated successfully.
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.232 2 INFO nova.compute.manager [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Took 1.04 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.233 2 DEBUG oslo.service.loopingcall [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.233 2 DEBUG nova.compute.manager [-] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.233 2 DEBUG nova.network.neutron [-] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.857 2 DEBUG nova.compute.manager [req-85380dd3-9724-4e02-a630-b2e890112910 req-6d92c95a-6db5-49aa-9253-a8ddf0972e2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received event network-vif-unplugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.858 2 DEBUG oslo_concurrency.lockutils [req-85380dd3-9724-4e02-a630-b2e890112910 req-6d92c95a-6db5-49aa-9253-a8ddf0972e2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.858 2 DEBUG oslo_concurrency.lockutils [req-85380dd3-9724-4e02-a630-b2e890112910 req-6d92c95a-6db5-49aa-9253-a8ddf0972e2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.858 2 DEBUG oslo_concurrency.lockutils [req-85380dd3-9724-4e02-a630-b2e890112910 req-6d92c95a-6db5-49aa-9253-a8ddf0972e2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.859 2 DEBUG nova.compute.manager [req-85380dd3-9724-4e02-a630-b2e890112910 req-6d92c95a-6db5-49aa-9253-a8ddf0972e2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] No waiting events found dispatching network-vif-unplugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:07 np0005466013 nova_compute[192144]: 2025-10-02 12:23:07.859 2 DEBUG nova.compute.manager [req-85380dd3-9724-4e02-a630-b2e890112910 req-6d92c95a-6db5-49aa-9253-a8ddf0972e2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received event network-vif-unplugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:23:08 np0005466013 nova_compute[192144]: 2025-10-02 12:23:08.916 2 DEBUG nova.network.neutron [-] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:08 np0005466013 nova_compute[192144]: 2025-10-02 12:23:08.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:08 np0005466013 nova_compute[192144]: 2025-10-02 12:23:08.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.021 2 DEBUG nova.compute.manager [req-bfbb45be-9bb5-41f5-8a93-1c28bdac383a req-061531c4-4280-4af6-a5fe-c1188777e69f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received event network-vif-deleted-7a345361-b752-4cc0-97a4-a1d5bfb267a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.022 2 INFO nova.compute.manager [req-bfbb45be-9bb5-41f5-8a93-1c28bdac383a req-061531c4-4280-4af6-a5fe-c1188777e69f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Neutron deleted interface 7a345361-b752-4cc0-97a4-a1d5bfb267a3; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.022 2 DEBUG nova.network.neutron [req-bfbb45be-9bb5-41f5-8a93-1c28bdac383a req-061531c4-4280-4af6-a5fe-c1188777e69f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.028 2 INFO nova.compute.manager [-] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Took 1.80 seconds to deallocate network for instance.#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.087 2 DEBUG nova.compute.manager [req-bfbb45be-9bb5-41f5-8a93-1c28bdac383a req-061531c4-4280-4af6-a5fe-c1188777e69f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Detach interface failed, port_id=7a345361-b752-4cc0-97a4-a1d5bfb267a3, reason: Instance 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.307 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.308 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.456 2 DEBUG nova.compute.provider_tree [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.502 2 DEBUG nova.scheduler.client.report [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.664 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:09 np0005466013 nova_compute[192144]: 2025-10-02 12:23:09.734 2 INFO nova.scheduler.client.report [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Deleted allocations for instance 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1#033[00m
Oct  2 08:23:10 np0005466013 nova_compute[192144]: 2025-10-02 12:23:10.045 2 DEBUG oslo_concurrency.lockutils [None req-be21448a-4b1e-432e-99d2-2d524e13b813 34f60e2f6dd64fc8a66fda781f291109 e50da64635e042abbcac5618a1476e01 - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:10 np0005466013 nova_compute[192144]: 2025-10-02 12:23:10.492 2 DEBUG nova.compute.manager [req-0f539129-356e-48b9-a7cd-9b473371cf9d req-fbf607b3-b6d6-4948-b030-eacac810e1b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received event network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:10 np0005466013 nova_compute[192144]: 2025-10-02 12:23:10.493 2 DEBUG oslo_concurrency.lockutils [req-0f539129-356e-48b9-a7cd-9b473371cf9d req-fbf607b3-b6d6-4948-b030-eacac810e1b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:10 np0005466013 nova_compute[192144]: 2025-10-02 12:23:10.493 2 DEBUG oslo_concurrency.lockutils [req-0f539129-356e-48b9-a7cd-9b473371cf9d req-fbf607b3-b6d6-4948-b030-eacac810e1b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:10 np0005466013 nova_compute[192144]: 2025-10-02 12:23:10.493 2 DEBUG oslo_concurrency.lockutils [req-0f539129-356e-48b9-a7cd-9b473371cf9d req-fbf607b3-b6d6-4948-b030-eacac810e1b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "7c3bfdbf-eed9-420a-b5c8-cfb648dceac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:10 np0005466013 nova_compute[192144]: 2025-10-02 12:23:10.494 2 DEBUG nova.compute.manager [req-0f539129-356e-48b9-a7cd-9b473371cf9d req-fbf607b3-b6d6-4948-b030-eacac810e1b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] No waiting events found dispatching network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:10 np0005466013 nova_compute[192144]: 2025-10-02 12:23:10.494 2 WARNING nova.compute.manager [req-0f539129-356e-48b9-a7cd-9b473371cf9d req-fbf607b3-b6d6-4948-b030-eacac810e1b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Received unexpected event network-vif-plugged-7a345361-b752-4cc0-97a4-a1d5bfb267a3 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:23:10 np0005466013 podman[237369]: 2025-10-02 12:23:10.689775313 +0000 UTC m=+0.063245388 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:23:10 np0005466013 podman[237370]: 2025-10-02 12:23:10.692006792 +0000 UTC m=+0.064899129 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3)
Oct  2 08:23:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:11.181 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466013 nova_compute[192144]: 2025-10-02 12:23:11.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:13 np0005466013 nova_compute[192144]: 2025-10-02 12:23:13.837 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:13 np0005466013 nova_compute[192144]: 2025-10-02 12:23:13.839 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:13 np0005466013 nova_compute[192144]: 2025-10-02 12:23:13.930 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:23:13 np0005466013 nova_compute[192144]: 2025-10-02 12:23:13.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.328 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.329 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.338 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.339 2 INFO nova.compute.claims [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.347 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.347 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.460 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.643 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.660 2 DEBUG nova.compute.provider_tree [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.722 2 DEBUG nova.scheduler.client.report [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.785 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.786 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.788 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.793 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.793 2 INFO nova.compute.claims [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.964 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:23:14 np0005466013 nova_compute[192144]: 2025-10-02 12:23:14.965 2 DEBUG nova.network.neutron [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.022 2 INFO nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.132 2 DEBUG nova.compute.provider_tree [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.214 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.227 2 DEBUG nova.policy [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.252 2 DEBUG nova.scheduler.client.report [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.332 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.333 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.669 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.670 2 DEBUG nova.network.neutron [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.743 2 INFO nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.783 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.785 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.785 2 INFO nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Creating image(s)#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.786 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "/var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.786 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.787 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.799 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.806 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.885 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.886 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.887 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.896 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.940 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.941 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.942 2 INFO nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Creating image(s)#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.942 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.943 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.943 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.955 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.956 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:15 np0005466013 nova_compute[192144]: 2025-10-02 12:23:15.976 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.015 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.016 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.061 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk 1073741824" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.062 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.063 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.089 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.100 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.134 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.135 2 DEBUG nova.virt.disk.api [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Checking if we can resize image /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.136 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.159 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.160 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.197 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.198 2 DEBUG nova.virt.disk.api [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Cannot resize image /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.199 2 DEBUG nova.objects.instance [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'migration_context' on Instance uuid 86e04f64-f88d-45c1-b90c-344bddb4c4b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.201 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.202 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.202 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.225 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.226 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Ensure instance console log exists: /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.227 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.227 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.227 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.262 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.263 2 DEBUG nova.virt.disk.api [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Checking if we can resize image /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.263 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.327 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.328 2 DEBUG nova.virt.disk.api [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Cannot resize image /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.328 2 DEBUG nova.objects.instance [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 32196dd3-2739-4c43-9532-b0365f8095af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.346 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.346 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Ensure instance console log exists: /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.347 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.347 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.347 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.353 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000071', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'hostId': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.354 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.355 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.355 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>]
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.356 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.360 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 / tap6374e02b-d2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.361 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7e0928e-4387-4e3f-9f99-bc615c809b24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.356363', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92ad457c-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '482041b575320291c540c81b718dc9b79d2ab248b7912805a3f38cedbde86059'}]}, 'timestamp': '2025-10-02 12:23:16.362122', '_unique_id': 'cf9bb985fe444adfa8c284c692f551e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.363 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.366 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.366 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f35abdc4-9e6b-4862-89bf-d798b903980d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.366572', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92ae10ce-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '41bdd4c91a64c7439ef6f2ea11864eda4cd6fbad7e32a6188c4215aebfe33d74'}]}, 'timestamp': '2025-10-02 12:23:16.367279', '_unique_id': '1a18680beef8420e885fbe13da89b503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.368 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.369 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.369 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.incoming.bytes volume: 1814 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b6953cc-50ee-403d-81d3-c3c9b8700c3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1814, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.369715', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92ae8464-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '5bf14802cee638a53ccd43960c1a8e7cb7afb451c5c4f59649ef5e794bd21c08'}]}, 'timestamp': '2025-10-02 12:23:16.370113', '_unique_id': 'a7fe5104434d419eae0fbc548bd37e34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.370 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.371 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.371 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.372 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>]
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.372 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.372 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23f0418e-4569-4612-9221-278e213020f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.372327', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92aee90e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': 'f263ff2a7a5eb1e14986a24973fbd5ff4c99c9e5dbd9998197205022cdb66dfc'}]}, 'timestamp': '2025-10-02 12:23:16.372709', '_unique_id': '7c3203fa49e1451ebe44e056186de70f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.373 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.389 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.390 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e446cb75-dbe1-4c61-8536-75d600b8abe2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.375105', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92b1a4c8-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.053367715, 'message_signature': 'b2ff94b2b2d29505e33df44b4342a0874ea6b8c76993870dc2e38618533839e3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.375105', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92b1b95e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.053367715, 'message_signature': '110ea9e91c37eb2e49212c36581ad84b2541baf85f5aeb79c0117fd7beb60550'}]}, 'timestamp': '2025-10-02 12:23:16.391163', '_unique_id': '0eb9315a517f47d78ac38bfad49185c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.392 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.393 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.420 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.write.bytes volume: 73015296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.421 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b735bf60-6ea7-4e20-9d69-72395c5ad22c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73015296, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.393887', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92b6519e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': 'a43e84c3bb4e0f69e32680c10a2f26b282a06224e6692a716f493f65d4a8f12f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.393887', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92b66418-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': '95d119ea56c1e5740a04bc6ecd4d966af56b3ffa609ee23d39baccd52778af9f'}]}, 'timestamp': '2025-10-02 12:23:16.421693', '_unique_id': 'e3344144037b4cacb7577da3610a67d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.423 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.424 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.424 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1447c9a0-1827-4f40-85ef-a24396b1fe3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.424449', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92b6df06-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '265587b5c51aaf711907b649fd5ebecc1ae6d065958e026da4967d3886e44ace'}]}, 'timestamp': '2025-10-02 12:23:16.424907', '_unique_id': 'f2e1f4cee63645fc8fae35f38ce3e71e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.426 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.426 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.427 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48c5c977-a109-4a15-b908-0f020b2fbbdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.426958', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92b73ee2-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.053367715, 'message_signature': '182f2ac3a6a9a2d235537148f4aa596920197a59df76cbd3fd79e47f5dae9587'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.426958', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92b74a36-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.053367715, 'message_signature': 'f1d4cc5db08c1a901067827d163ef3bbfff26f46213c89ca9ffa47b6b0269173'}]}, 'timestamp': '2025-10-02 12:23:16.427562', '_unique_id': '1f611d5113bf45eb8aad10cdd57ec67d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.428 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.429 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.429 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97eb79d3-5038-45a5-b780-a271ac49e5da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.429690', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92b7ab3e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '87489895f7ac252f5e0f9e654b8de92c66a745ce481cea8e78207ae00ed31c4d'}]}, 'timestamp': '2025-10-02 12:23:16.430136', '_unique_id': '7e0350b51ee74002998efd10c527306e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.432 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.453 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/cpu volume: 12490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af9ae0d1-12b3-4147-95df-5dd980435b1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12490000000, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'timestamp': '2025-10-02T12:23:16.432224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '92bb6fbc-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.131986259, 'message_signature': 'c8b2a2aa1add1391b44967dbcc65beff9dd32c55a1ab05eb444fef1c6952dc83'}]}, 'timestamp': '2025-10-02 12:23:16.455035', '_unique_id': 'ddea9f5ad1514a239b9ee0bbdcd4d50d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.458 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.459 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>]
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.459 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdf9081f-33a5-4741-8abc-ad280f323da9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.459701', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92bc45fe-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': 'ff01350617b54cc15c24d6b362b24506f01f3ed2b69ad5fb94996417c5ac52b2'}]}, 'timestamp': '2025-10-02 12:23:16.460437', '_unique_id': '401250a2f6dd42c28004ecb731561da8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.461 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.463 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.463 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.read.latency volume: 826240903 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.464 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.read.latency volume: 55696347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac310d48-f2d8-42b2-b9a3-9a885b88eab4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 826240903, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.463620', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92bcdbcc-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': 'ea7ed6d1510f0c00da8e0ab794f275e7063ce650989dacd5dc6ce67b93bab871'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55696347, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.463620', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92bcf10c-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': 'dd3d60f06a3741e28bb343061b57084a22fb2e88a40c87f561665b05df0cd6b3'}]}, 'timestamp': '2025-10-02 12:23:16.464677', '_unique_id': 'be5b6a712efe49dd89f203babcf21409'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.467 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.467 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/memory.usage volume: 42.5703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '551e9f3f-491b-4af3-a963-4145f5076224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.5703125, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'timestamp': '2025-10-02T12:23:16.467594', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '92bd7a5a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.131986259, 'message_signature': '06e524c4ecfce777869912e1547f5e708baac21fab72d45ec100a1cc67524193'}]}, 'timestamp': '2025-10-02 12:23:16.468289', '_unique_id': '8cb8dac1bf9e4b348b460307a134ce9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.469 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.471 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.read.requests volume: 1128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.472 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2b7ddb0-f776-4a9c-bc5b-778e6d6ffbe0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1128, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.471495', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92be10d2-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': '018ce6aa25a3950cf3f65e02660ac70d3480b4c35ec17d05c5fb9aab1cdb2b31'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.471495', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92be2a18-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': '3407bb9f7b8c60c49be200a185bb2c083fcf0dc5847ed8fc6d5bd4d2bb33930e'}]}, 'timestamp': '2025-10-02 12:23:16.472773', '_unique_id': '56b2e4b4d3f54e1888f53f620b8e7f0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.476 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.476 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5244d2d1-9976-493d-a1b1-77ecef58f273', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.476083', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92bec48c-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.053367715, 'message_signature': '85a56d9bdd31e06d18e22d29428bb463529bff60540e1e43303dea8cb688af68'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.476083', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92bed94a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.053367715, 'message_signature': 'c1916118945aaae0bd7811f4e17d804d5a193af019c2675177e4a3941f0f1833'}]}, 'timestamp': '2025-10-02 12:23:16.477266', '_unique_id': '9343807cc9cf4983847dbc2fbe88cd15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.479 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.480 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.write.latency volume: 32039058219 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.480 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b1a59e2-9e66-495b-97e3-4d6c41a6a896', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32039058219, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.479988', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92bf5d7a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': '1bdf89dcc0273be322327e9a48e5a7324f77e2ed6622646a8b7c1029babd9f18'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.479988', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92bf7562-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': '41402ed9e70e986f905caea2d42485c4535efaca649ad1a12a2c0af1276dda89'}]}, 'timestamp': '2025-10-02 12:23:16.481249', '_unique_id': 'f0eb46577bac435db7e28014c0a3e486'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.483 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.483 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.483 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1061088034>]
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.484 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.484 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b745ecd6-e226-4d87-af6d-98b094f53a55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.484179', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92bffa82-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '49171c14a975281e4fa8571e8c522f71bd193b71b2b5b2a53d3faca8d48c833e'}]}, 'timestamp': '2025-10-02 12:23:16.484529', '_unique_id': 'f4032527330a42989787b445626dc360'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.485 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.486 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.486 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28e14d58-f04d-49b8-b694-76131a81ea7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.486151', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92c0478a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': 'afd8cec26f30f6acabfc17b9022b3d08812d4d00f5cc4a6e518c45b0577b391a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.486151', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92c05306-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': '728eaaf7c8b630429c3d0ecd1b58a6c5880cfcf9a7f839308336a556c3a3466e'}]}, 'timestamp': '2025-10-02 12:23:16.486769', '_unique_id': 'a48331944fca428dbff7f8417a45a657'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.487 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.488 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.488 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.read.bytes volume: 30988800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.488 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec0d1aa0-b21f-4981-919e-b028c652541a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30988800, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-vda', 'timestamp': '2025-10-02T12:23:16.488379', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92c09db6-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': 'e341ca465e04b3e306963992e405a6b08d5739efa4791d1241e924695c5560ba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-sda', 'timestamp': '2025-10-02T12:23:16.488379', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'instance-00000071', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92c0a8f6-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.072148469, 'message_signature': '9a08fff8be2bd059c14a7464f9679ebc490aca2f3592ba6dd2b576da6429b336'}]}, 'timestamp': '2025-10-02 12:23:16.488989', '_unique_id': '1342ad20bf8348d9854b21cadd2e10bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.489 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.490 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.490 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62519b85-af08-4db9-80c1-1ce44e21ee07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.490586', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92c0f40a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '5452f30d9034a6f95d5c256f4df4051c0a77671589d565a99333f54b50856044'}]}, 'timestamp': '2025-10-02 12:23:16.490934', '_unique_id': '94c336a33a094eaf85cdc60e54d67c42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.492 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.492 12 DEBUG ceilometer.compute.pollsters [-] f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d611f88-acfc-4bf4-9c8a-3544bb327917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0ea122e2fff94f2ba7c78bf30b04029c', 'user_name': None, 'project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'project_name': None, 'resource_id': 'instance-00000071-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-tap6374e02b-d2', 'timestamp': '2025-10-02T12:23:16.492485', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1061088034', 'name': 'tap6374e02b-d2', 'instance_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'instance_type': 'm1.nano', 'host': '61d989f57ecbfda9653e795206868fcb3b6a27557f78d6ee65648a03', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:03:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6374e02b-d2'}, 'message_id': '92c13e38-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5788.034620032, 'message_signature': '3fbca456ecd3452f3b01fb5dc96fe79cf74b088a80cfea7cca5495f3d28edcac'}]}, 'timestamp': '2025-10-02 12:23:16.492861', '_unique_id': '1150740c1b98433eb12fd9c407fe86e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:23:16.493 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466013 nova_compute[192144]: 2025-10-02 12:23:16.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:17 np0005466013 nova_compute[192144]: 2025-10-02 12:23:17.367 2 DEBUG nova.policy [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:23:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:18Z|00449|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:23:18 np0005466013 nova_compute[192144]: 2025-10-02 12:23:18.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:18 np0005466013 nova_compute[192144]: 2025-10-02 12:23:18.908 2 DEBUG nova.network.neutron [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Successfully created port: 399f8dde-d495-4c04-893b-bb0bf50f2cf0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.300 2 DEBUG nova.network.neutron [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Successfully created port: 375c20c8-b3bc-484b-820a-f3988fb1bfa1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.736 2 DEBUG nova.network.neutron [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Successfully updated port: 399f8dde-d495-4c04-893b-bb0bf50f2cf0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.758 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.759 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquired lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.759 2 DEBUG nova.network.neutron [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.898 2 DEBUG nova.compute.manager [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received event network-changed-399f8dde-d495-4c04-893b-bb0bf50f2cf0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.899 2 DEBUG nova.compute.manager [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Refreshing instance network info cache due to event network-changed-399f8dde-d495-4c04-893b-bb0bf50f2cf0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:19 np0005466013 nova_compute[192144]: 2025-10-02 12:23:19.900 2 DEBUG oslo_concurrency.lockutils [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.079 2 DEBUG nova.network.neutron [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.260 2 DEBUG nova.network.neutron [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Successfully updated port: 375c20c8-b3bc-484b-820a-f3988fb1bfa1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.294 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.295 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.295 2 DEBUG nova.network.neutron [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.393 2 DEBUG nova.compute.manager [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.393 2 DEBUG nova.compute.manager [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing instance network info cache due to event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:20 np0005466013 nova_compute[192144]: 2025-10-02 12:23:20.394 2 DEBUG oslo_concurrency.lockutils [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:21 np0005466013 nova_compute[192144]: 2025-10-02 12:23:21.330 2 DEBUG nova.network.neutron [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:23:21 np0005466013 nova_compute[192144]: 2025-10-02 12:23:21.464 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407786.4633453, 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:21 np0005466013 nova_compute[192144]: 2025-10-02 12:23:21.465 2 INFO nova.compute.manager [-] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:23:21 np0005466013 nova_compute[192144]: 2025-10-02 12:23:21.513 2 DEBUG nova.compute.manager [None req-268edda2-b415-483a-a6b6-fd886bd7270f - - - - - -] [instance: 7c3bfdbf-eed9-420a-b5c8-cfb648dceac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:21 np0005466013 nova_compute[192144]: 2025-10-02 12:23:21.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.004 2 DEBUG nova.network.neutron [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Updating instance_info_cache with network_info: [{"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.032 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Releasing lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.032 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Instance network_info: |[{"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.033 2 DEBUG oslo_concurrency.lockutils [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.033 2 DEBUG nova.network.neutron [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Refreshing network info cache for port 399f8dde-d495-4c04-893b-bb0bf50f2cf0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.036 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Start _get_guest_xml network_info=[{"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.040 2 WARNING nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.046 2 DEBUG nova.virt.libvirt.host [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.046 2 DEBUG nova.virt.libvirt.host [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.050 2 DEBUG nova.virt.libvirt.host [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.051 2 DEBUG nova.virt.libvirt.host [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.052 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.053 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.053 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.054 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.054 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.055 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.055 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.055 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.056 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.056 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.057 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.057 2 DEBUG nova.virt.hardware [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.063 2 DEBUG nova.virt.libvirt.vif [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-292933404',display_name='tempest-ServerActionsTestOtherB-server-292933404',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-292933404',id=116,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-50xphvii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOt
herB-263921372-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:15Z,user_data=None,user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=86e04f64-f88d-45c1-b90c-344bddb4c4b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.064 2 DEBUG nova.network.os_vif_util [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.065 2 DEBUG nova.network.os_vif_util [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:da:03,bridge_name='br-int',has_traffic_filtering=True,id=399f8dde-d495-4c04-893b-bb0bf50f2cf0,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap399f8dde-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.067 2 DEBUG nova.objects.instance [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 86e04f64-f88d-45c1-b90c-344bddb4c4b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.082 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <uuid>86e04f64-f88d-45c1-b90c-344bddb4c4b9</uuid>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <name>instance-00000074</name>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerActionsTestOtherB-server-292933404</nova:name>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:23:22</nova:creationTime>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:user uuid="0ea122e2fff94f2ba7c78bf30b04029c">tempest-ServerActionsTestOtherB-263921372-project-member</nova:user>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:project uuid="ffce7d629aa24a7f970d93b2a79045f1">tempest-ServerActionsTestOtherB-263921372</nova:project>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        <nova:port uuid="399f8dde-d495-4c04-893b-bb0bf50f2cf0">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <entry name="serial">86e04f64-f88d-45c1-b90c-344bddb4c4b9</entry>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <entry name="uuid">86e04f64-f88d-45c1-b90c-344bddb4c4b9</entry>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk.config"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:50:da:03"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <target dev="tap399f8dde-d4"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/console.log" append="off"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:23:22 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:23:22 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:23:22 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:23:22 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.084 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Preparing to wait for external event network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.084 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.084 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.084 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.085 2 DEBUG nova.virt.libvirt.vif [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-292933404',display_name='tempest-ServerActionsTestOtherB-server-292933404',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-292933404',id=116,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-50xphvii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerAct
ionsTestOtherB-263921372-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:15Z,user_data=None,user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=86e04f64-f88d-45c1-b90c-344bddb4c4b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.086 2 DEBUG nova.network.os_vif_util [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.087 2 DEBUG nova.network.os_vif_util [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:da:03,bridge_name='br-int',has_traffic_filtering=True,id=399f8dde-d495-4c04-893b-bb0bf50f2cf0,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap399f8dde-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.087 2 DEBUG os_vif [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:da:03,bridge_name='br-int',has_traffic_filtering=True,id=399f8dde-d495-4c04-893b-bb0bf50f2cf0,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap399f8dde-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.088 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.089 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap399f8dde-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap399f8dde-d4, col_values=(('external_ids', {'iface-id': '399f8dde-d495-4c04-893b-bb0bf50f2cf0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:da:03', 'vm-uuid': '86e04f64-f88d-45c1-b90c-344bddb4c4b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:22 np0005466013 NetworkManager[51205]: <info>  [1759407802.0964] manager: (tap399f8dde-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.104 2 INFO os_vif [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:da:03,bridge_name='br-int',has_traffic_filtering=True,id=399f8dde-d495-4c04-893b-bb0bf50f2cf0,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap399f8dde-d4')#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.195 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.195 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.196 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No VIF found with MAC fa:16:3e:50:da:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:23:22 np0005466013 nova_compute[192144]: 2025-10-02 12:23:22.196 2 INFO nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Using config drive#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.464 2 DEBUG nova.network.neutron [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.502 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.503 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance network_info: |[{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.505 2 DEBUG oslo_concurrency.lockutils [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.506 2 DEBUG nova.network.neutron [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.512 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Start _get_guest_xml network_info=[{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.520 2 WARNING nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.525 2 DEBUG nova.virt.libvirt.host [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.527 2 DEBUG nova.virt.libvirt.host [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.532 2 DEBUG nova.virt.libvirt.host [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.533 2 DEBUG nova.virt.libvirt.host [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.534 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.535 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.535 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.535 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.536 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.536 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.536 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.536 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.537 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.537 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.537 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.538 2 DEBUG nova.virt.hardware [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.541 2 DEBUG nova.virt.libvirt.vif [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2105436045',display_name='tempest-TestNetworkAdvancedServerOps-server-2105436045',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2105436045',id=117,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIPVZ/1ugRUXJi6kpxyVgRUtYTdMlYSz5NQQRRxSWUHE0SJ8tz8WjHhrHski+4uyv4G//M9upfdriwZTygaxranlXIWK6yJW4zVM7pqGP5AEtkUxwGNjsUk0aVRz2H8oSQ==',key_name='tempest-TestNetworkAdvancedServerOps-1888170662',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-0oq6jqhj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:15Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=32196dd3-2739-4c43-9532-b0365f8095af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.542 2 DEBUG nova.network.os_vif_util [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.542 2 DEBUG nova.network.os_vif_util [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.543 2 DEBUG nova.objects.instance [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 32196dd3-2739-4c43-9532-b0365f8095af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.556 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <uuid>32196dd3-2739-4c43-9532-b0365f8095af</uuid>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <name>instance-00000075</name>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2105436045</nova:name>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:23:23</nova:creationTime>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        <nova:port uuid="375c20c8-b3bc-484b-820a-f3988fb1bfa1">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <entry name="serial">32196dd3-2739-4c43-9532-b0365f8095af</entry>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <entry name="uuid">32196dd3-2739-4c43-9532-b0365f8095af</entry>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:af:53:5f"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <target dev="tap375c20c8-b3"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/console.log" append="off"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:23:23 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:23:23 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:23:23 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:23:23 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.558 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Preparing to wait for external event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.558 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.559 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.559 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.560 2 DEBUG nova.virt.libvirt.vif [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2105436045',display_name='tempest-TestNetworkAdvancedServerOps-server-2105436045',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2105436045',id=117,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIPVZ/1ugRUXJi6kpxyVgRUtYTdMlYSz5NQQRRxSWUHE0SJ8tz8WjHhrHski+4uyv4G//M9upfdriwZTygaxranlXIWK6yJW4zVM7pqGP5AEtkUxwGNjsUk0aVRz2H8oSQ==',key_name='tempest-TestNetworkAdvancedServerOps-1888170662',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-0oq6jqhj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:15Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=32196dd3-2739-4c43-9532-b0365f8095af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.561 2 DEBUG nova.network.os_vif_util [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.562 2 DEBUG nova.network.os_vif_util [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.562 2 DEBUG os_vif [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.568 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap375c20c8-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap375c20c8-b3, col_values=(('external_ids', {'iface-id': '375c20c8-b3bc-484b-820a-f3988fb1bfa1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:53:5f', 'vm-uuid': '32196dd3-2739-4c43-9532-b0365f8095af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:23 np0005466013 NetworkManager[51205]: <info>  [1759407803.5732] manager: (tap375c20c8-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.585 2 INFO os_vif [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3')#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.657 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.657 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.657 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No VIF found with MAC fa:16:3e:af:53:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:23:23 np0005466013 nova_compute[192144]: 2025-10-02 12:23:23.658 2 INFO nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Using config drive#033[00m
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.374 2 INFO nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Creating config drive at /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk.config#033[00m
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.379 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmts5ge8k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.514 2 DEBUG oslo_concurrency.processutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmts5ge8k" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:24 np0005466013 NetworkManager[51205]: <info>  [1759407804.6074] manager: (tap399f8dde-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Oct  2 08:23:24 np0005466013 kernel: tap399f8dde-d4: entered promiscuous mode
Oct  2 08:23:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:24Z|00450|binding|INFO|Claiming lport 399f8dde-d495-4c04-893b-bb0bf50f2cf0 for this chassis.
Oct  2 08:23:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:24Z|00451|binding|INFO|399f8dde-d495-4c04-893b-bb0bf50f2cf0: Claiming fa:16:3e:50:da:03 10.100.0.4
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.630 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:da:03 10.100.0.4'], port_security=['fa:16:3e:50:da:03 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '86e04f64-f88d-45c1-b90c-344bddb4c4b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7993d405-22b6-4649-b5b8-9f3e7d07d4ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=399f8dde-d495-4c04-893b-bb0bf50f2cf0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.631 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 399f8dde-d495-4c04-893b-bb0bf50f2cf0 in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 bound to our chassis#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.634 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20eb29be-ee23-463b-85af-bfc2388e9f77#033[00m
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:24Z|00452|binding|INFO|Setting lport 399f8dde-d495-4c04-893b-bb0bf50f2cf0 ovn-installed in OVS
Oct  2 08:23:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:24Z|00453|binding|INFO|Setting lport 399f8dde-d495-4c04-893b-bb0bf50f2cf0 up in Southbound
Oct  2 08:23:24 np0005466013 systemd-udevd[237505]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:23:24 np0005466013 systemd-machined[152202]: New machine qemu-53-instance-00000074.
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.652 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c2191543-90ec-45d5-a9b4-8e224aa4e727]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:24 np0005466013 systemd[1]: Started Virtual Machine qemu-53-instance-00000074.
Oct  2 08:23:24 np0005466013 NetworkManager[51205]: <info>  [1759407804.6719] device (tap399f8dde-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:23:24 np0005466013 NetworkManager[51205]: <info>  [1759407804.6733] device (tap399f8dde-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:23:24 np0005466013 podman[237456]: 2025-10-02 12:23:24.675780239 +0000 UTC m=+0.080965628 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.693 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0abefd4f-1c75-4aaa-8230-8bb7b7969e89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.697 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fade7833-6e94-4447-8f35-325853693686]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:24 np0005466013 podman[237457]: 2025-10-02 12:23:24.705006388 +0000 UTC m=+0.110099624 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.724 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[bde197f4-e90d-45fc-8030-18f4f629d2c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:24 np0005466013 podman[237458]: 2025-10-02 12:23:24.725598719 +0000 UTC m=+0.127096703 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.743 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ebefb717-8049-4ba8-995a-ed6a591cd7bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561951, 'reachable_time': 40813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237544, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.762 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5210dd8f-9552-44c5-ab88-7b81740cd5be]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561963, 'tstamp': 561963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237546, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561966, 'tstamp': 561966}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237546, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.764 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:24 np0005466013 nova_compute[192144]: 2025-10-02 12:23:24.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.770 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20eb29be-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.771 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.771 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20eb29be-e0, col_values=(('external_ids', {'iface-id': 'e533861f-45cb-4843-b071-0b628ca25128'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:24.771 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.379 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407805.3787668, 86e04f64-f88d-45c1-b90c-344bddb4c4b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.380 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] VM Started (Lifecycle Event)#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.401 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.405 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407805.3803313, 86e04f64-f88d-45c1-b90c-344bddb4c4b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.406 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.424 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.429 2 INFO nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Creating config drive at /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.437 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplik2zieq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.470 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.511 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.576 2 DEBUG oslo_concurrency.processutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplik2zieq" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:25 np0005466013 NetworkManager[51205]: <info>  [1759407805.6553] manager: (tap375c20c8-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Oct  2 08:23:25 np0005466013 kernel: tap375c20c8-b3: entered promiscuous mode
Oct  2 08:23:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:25Z|00454|binding|INFO|Claiming lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 for this chassis.
Oct  2 08:23:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:25Z|00455|binding|INFO|375c20c8-b3bc-484b-820a-f3988fb1bfa1: Claiming fa:16:3e:af:53:5f 10.100.0.11
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:25 np0005466013 NetworkManager[51205]: <info>  [1759407805.6702] device (tap375c20c8-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:23:25 np0005466013 NetworkManager[51205]: <info>  [1759407805.6724] device (tap375c20c8-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.676 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:53:5f 10.100.0.11'], port_security=['fa:16:3e:af:53:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '32196dd3-2739-4c43-9532-b0365f8095af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91662be7-398f-4c34-a848-62b46821f0fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '56d0844b-17cf-4186-b565-d275a3fd7b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bb1944c-7514-4575-bf6c-55d1c733e488, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375c20c8-b3bc-484b-820a-f3988fb1bfa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.677 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 in datapath 91662be7-398f-4c34-a848-62b46821f0fd bound to our chassis#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.679 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 91662be7-398f-4c34-a848-62b46821f0fd#033[00m
Oct  2 08:23:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:25Z|00456|binding|INFO|Setting lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 ovn-installed in OVS
Oct  2 08:23:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:25Z|00457|binding|INFO|Setting lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 up in Southbound
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.697 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a764e2da-ed55-48b8-acff-c80db6f193f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.698 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap91662be7-31 in ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.700 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap91662be7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.700 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3e62e2b1-4c56-4542-8899-ed0aa0914504]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.701 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2746c5f1-1d7e-424f-a8d2-5d59761dcb49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 systemd-machined[152202]: New machine qemu-54-instance-00000075.
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.712 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7a1cd7-e996-4658-bae8-9de33a559e49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 systemd[1]: Started Virtual Machine qemu-54-instance-00000075.
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.741 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a732494f-36a4-499b-8663-2c98a6d9f73a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.743 2 DEBUG nova.compute.manager [req-8cb4ba31-3be3-4489-8d4a-2bfa515dc0fe req-d0161ca3-b282-4579-b588-0b9fbec61971 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received event network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.744 2 DEBUG oslo_concurrency.lockutils [req-8cb4ba31-3be3-4489-8d4a-2bfa515dc0fe req-d0161ca3-b282-4579-b588-0b9fbec61971 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.744 2 DEBUG oslo_concurrency.lockutils [req-8cb4ba31-3be3-4489-8d4a-2bfa515dc0fe req-d0161ca3-b282-4579-b588-0b9fbec61971 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.744 2 DEBUG oslo_concurrency.lockutils [req-8cb4ba31-3be3-4489-8d4a-2bfa515dc0fe req-d0161ca3-b282-4579-b588-0b9fbec61971 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.744 2 DEBUG nova.compute.manager [req-8cb4ba31-3be3-4489-8d4a-2bfa515dc0fe req-d0161ca3-b282-4579-b588-0b9fbec61971 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Processing event network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.745 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.752 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407805.7524617, 86e04f64-f88d-45c1-b90c-344bddb4c4b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.753 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.755 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.760 2 INFO nova.virt.libvirt.driver [-] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Instance spawned successfully.#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.760 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.774 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[16dce8aa-4a76-45e2-9160-af94c50dbf1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 NetworkManager[51205]: <info>  [1759407805.7816] manager: (tap91662be7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.781 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[487690dc-ffdf-4278-9067-dea297202d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.786 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.807 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.812 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.812 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.813 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.813 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.814 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.814 2 DEBUG nova.virt.libvirt.driver [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.815 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7cdfff2b-ca98-47c5-8f2b-f5f94d91ceed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.818 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[46d436a6-addc-484f-8d49-c7eace4fce24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 NetworkManager[51205]: <info>  [1759407805.8468] device (tap91662be7-30): carrier: link connected
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.853 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[37af9153-4239-4ef3-96ff-c0b84645a15b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.869 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.872 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fca59918-3563-47fa-974a-393e122dd92b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91662be7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:4b:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579746, 'reachable_time': 38443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237606, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.890 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b69504d2-0182-4e65-9ca6-5f96c4e10bda]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:4b1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 579746, 'tstamp': 579746}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237607, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.908 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e49032a9-cc90-45b8-ba72-4e04d73cf576]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91662be7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:4b:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579746, 'reachable_time': 38443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237608, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.942 2 INFO nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Took 10.16 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:23:25 np0005466013 nova_compute[192144]: 2025-10-02 12:23:25.942 2 DEBUG nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:25.962 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4c99a37a-4ed2-44b3-b40a-91bfc9e332fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.028 2 INFO nova.compute.manager [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Took 11.87 seconds to build instance.#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.034 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[58cd60e2-f975-4f09-a35b-e2d41d5cabaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.035 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91662be7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.035 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.035 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91662be7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:26 np0005466013 NetworkManager[51205]: <info>  [1759407806.0381] manager: (tap91662be7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:26 np0005466013 kernel: tap91662be7-30: entered promiscuous mode
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.042 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap91662be7-30, col_values=(('external_ids', {'iface-id': '9bd4e8e6-11ff-43aa-92bf-67aec1a8e528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:26Z|00458|binding|INFO|Releasing lport 9bd4e8e6-11ff-43aa-92bf-67aec1a8e528 from this chassis (sb_readonly=0)
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.046 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/91662be7-398f-4c34-a848-62b46821f0fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/91662be7-398f-4c34-a848-62b46821f0fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.047 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f31d73-fb68-4ca1-8c5a-35b8bc7bb339]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.049 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-91662be7-398f-4c34-a848-62b46821f0fd
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/91662be7-398f-4c34-a848-62b46821f0fd.pid.haproxy
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 91662be7-398f-4c34-a848-62b46821f0fd
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:23:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:26.050 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'env', 'PROCESS_TAG=haproxy-91662be7-398f-4c34-a848-62b46821f0fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/91662be7-398f-4c34-a848-62b46821f0fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.061 2 DEBUG oslo_concurrency.lockutils [None req-2c142e1b-7a70-4c5e-882f-a966bdd288bb 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.187 2 DEBUG nova.compute.manager [req-105730b0-6d19-4bf5-8ea6-c8e630449f75 req-afd6c4f5-bccc-4f0e-aa8a-3734136d1973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.187 2 DEBUG oslo_concurrency.lockutils [req-105730b0-6d19-4bf5-8ea6-c8e630449f75 req-afd6c4f5-bccc-4f0e-aa8a-3734136d1973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.188 2 DEBUG oslo_concurrency.lockutils [req-105730b0-6d19-4bf5-8ea6-c8e630449f75 req-afd6c4f5-bccc-4f0e-aa8a-3734136d1973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.188 2 DEBUG oslo_concurrency.lockutils [req-105730b0-6d19-4bf5-8ea6-c8e630449f75 req-afd6c4f5-bccc-4f0e-aa8a-3734136d1973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.188 2 DEBUG nova.compute.manager [req-105730b0-6d19-4bf5-8ea6-c8e630449f75 req-afd6c4f5-bccc-4f0e-aa8a-3734136d1973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Processing event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:23:26 np0005466013 podman[237642]: 2025-10-02 12:23:26.488298974 +0000 UTC m=+0.065473607 container create 612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.491 2 DEBUG nova.network.neutron [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updated VIF entry in instance network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.492 2 DEBUG nova.network.neutron [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:26 np0005466013 systemd[1]: Started libpod-conmon-612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674.scope.
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.527 2 DEBUG oslo_concurrency.lockutils [req-e48ee08a-c6b7-4970-bb73-e4c0a83eeeec req-5126bbff-a422-4115-a5a7-38a316b3fb4d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.541 2 DEBUG nova.network.neutron [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Updated VIF entry in instance network info cache for port 399f8dde-d495-4c04-893b-bb0bf50f2cf0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.542 2 DEBUG nova.network.neutron [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Updating instance_info_cache with network_info: [{"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:26 np0005466013 podman[237642]: 2025-10-02 12:23:26.454114472 +0000 UTC m=+0.031289124 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:23:26 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:23:26 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/492a7a83638f26ef21a5e7e3135374166a3aed468a0298ec7b2cc87c85052369/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.568 2 DEBUG oslo_concurrency.lockutils [req-a06611c6-a797-46a7-b7e3-da08b9b95cc9 req-ce54fe00-f58e-4ee3-a24b-93d1def1e9c5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:26 np0005466013 podman[237642]: 2025-10-02 12:23:26.574272798 +0000 UTC m=+0.151447460 container init 612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:23:26 np0005466013 podman[237642]: 2025-10-02 12:23:26.57979699 +0000 UTC m=+0.156971622 container start 612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:23:26 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [NOTICE]   (237667) : New worker (237669) forked
Oct  2 08:23:26 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [NOTICE]   (237667) : Loading success.
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.903 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407806.9035368, 32196dd3-2739-4c43-9532-b0365f8095af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.904 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Started (Lifecycle Event)#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.905 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.909 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.912 2 INFO nova.virt.libvirt.driver [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance spawned successfully.#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.912 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.936 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.941 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.943 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.944 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.944 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.944 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.945 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.945 2 DEBUG nova.virt.libvirt.driver [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.980 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.981 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407806.903686, 32196dd3-2739-4c43-9532-b0365f8095af => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.981 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:26 np0005466013 nova_compute[192144]: 2025-10-02 12:23:26.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.020 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.022 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407806.9090528, 32196dd3-2739-4c43-9532-b0365f8095af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.023 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.041 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.044 2 INFO nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Took 11.10 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.044 2 DEBUG nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.046 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.075 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.131 2 INFO nova.compute.manager [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Took 12.54 seconds to build instance.#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.153 2 DEBUG oslo_concurrency.lockutils [None req-2b490378-8159-4b3d-aabc-f4cade2cbf20 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.912 2 DEBUG nova.compute.manager [req-9a3049e4-a410-465e-8206-cfd2e5f2aed8 req-6aa6886a-6375-4f7b-abe4-445728c91cd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received event network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.913 2 DEBUG oslo_concurrency.lockutils [req-9a3049e4-a410-465e-8206-cfd2e5f2aed8 req-6aa6886a-6375-4f7b-abe4-445728c91cd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.913 2 DEBUG oslo_concurrency.lockutils [req-9a3049e4-a410-465e-8206-cfd2e5f2aed8 req-6aa6886a-6375-4f7b-abe4-445728c91cd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.913 2 DEBUG oslo_concurrency.lockutils [req-9a3049e4-a410-465e-8206-cfd2e5f2aed8 req-6aa6886a-6375-4f7b-abe4-445728c91cd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.913 2 DEBUG nova.compute.manager [req-9a3049e4-a410-465e-8206-cfd2e5f2aed8 req-6aa6886a-6375-4f7b-abe4-445728c91cd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] No waiting events found dispatching network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:27 np0005466013 nova_compute[192144]: 2025-10-02 12:23:27.913 2 WARNING nova.compute.manager [req-9a3049e4-a410-465e-8206-cfd2e5f2aed8 req-6aa6886a-6375-4f7b-abe4-445728c91cd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received unexpected event network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.298 2 DEBUG nova.compute.manager [req-8633c6ae-0c02-4e8f-bb37-a601542e228a req-c3f3e27b-e718-4505-a329-a3f8cdcf1fd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.299 2 DEBUG oslo_concurrency.lockutils [req-8633c6ae-0c02-4e8f-bb37-a601542e228a req-c3f3e27b-e718-4505-a329-a3f8cdcf1fd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.299 2 DEBUG oslo_concurrency.lockutils [req-8633c6ae-0c02-4e8f-bb37-a601542e228a req-c3f3e27b-e718-4505-a329-a3f8cdcf1fd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.299 2 DEBUG oslo_concurrency.lockutils [req-8633c6ae-0c02-4e8f-bb37-a601542e228a req-c3f3e27b-e718-4505-a329-a3f8cdcf1fd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.299 2 DEBUG nova.compute.manager [req-8633c6ae-0c02-4e8f-bb37-a601542e228a req-c3f3e27b-e718-4505-a329-a3f8cdcf1fd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.299 2 WARNING nova.compute.manager [req-8633c6ae-0c02-4e8f-bb37-a601542e228a req-c3f3e27b-e718-4505-a329-a3f8cdcf1fd5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:23:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:28Z|00459|binding|INFO|Releasing lport 9bd4e8e6-11ff-43aa-92bf-67aec1a8e528 from this chassis (sb_readonly=0)
Oct  2 08:23:28 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:28Z|00460|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.829 2 INFO nova.compute.manager [None req-678bef43-aa43-4e10-a347-99c77542c002 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Pausing#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.830 2 DEBUG nova.objects.instance [None req-678bef43-aa43-4e10-a347-99c77542c002 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'flavor' on Instance uuid 86e04f64-f88d-45c1-b90c-344bddb4c4b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.873 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407808.8719654, 86e04f64-f88d-45c1-b90c-344bddb4c4b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.874 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.878 2 DEBUG nova.compute.manager [None req-678bef43-aa43-4e10-a347-99c77542c002 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.912 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.916 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.950 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Oct  2 08:23:28 np0005466013 nova_compute[192144]: 2025-10-02 12:23:28.997 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.035 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.035 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.035 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.199 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.263 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.264 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.322 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.330 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.398 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.400 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.466 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.475 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.549 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.553 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.621 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.814 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.816 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5264MB free_disk=73.25554275512695GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.816 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.816 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.939 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.939 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 86e04f64-f88d-45c1-b90c-344bddb4c4b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.940 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 32196dd3-2739-4c43-9532-b0365f8095af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.940 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:23:29 np0005466013 nova_compute[192144]: 2025-10-02 12:23:29.940 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:23:30 np0005466013 nova_compute[192144]: 2025-10-02 12:23:30.058 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:30 np0005466013 nova_compute[192144]: 2025-10-02 12:23:30.096 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:30 np0005466013 nova_compute[192144]: 2025-10-02 12:23:30.135 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:23:30 np0005466013 nova_compute[192144]: 2025-10-02 12:23:30.135 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:32 np0005466013 nova_compute[192144]: 2025-10-02 12:23:32.738 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:32 np0005466013 nova_compute[192144]: 2025-10-02 12:23:32.742 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:32 np0005466013 nova_compute[192144]: 2025-10-02 12:23:32.743 2 INFO nova.compute.manager [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Shelving#033[00m
Oct  2 08:23:32 np0005466013 kernel: tap399f8dde-d4 (unregistering): left promiscuous mode
Oct  2 08:23:32 np0005466013 NetworkManager[51205]: <info>  [1759407812.8422] device (tap399f8dde-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:23:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:32Z|00461|binding|INFO|Releasing lport 399f8dde-d495-4c04-893b-bb0bf50f2cf0 from this chassis (sb_readonly=0)
Oct  2 08:23:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:32Z|00462|binding|INFO|Setting lport 399f8dde-d495-4c04-893b-bb0bf50f2cf0 down in Southbound
Oct  2 08:23:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:32Z|00463|binding|INFO|Removing iface tap399f8dde-d4 ovn-installed in OVS
Oct  2 08:23:32 np0005466013 nova_compute[192144]: 2025-10-02 12:23:32.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:32.911 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:da:03 10.100.0.4'], port_security=['fa:16:3e:50:da:03 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '86e04f64-f88d-45c1-b90c-344bddb4c4b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7993d405-22b6-4649-b5b8-9f3e7d07d4ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=399f8dde-d495-4c04-893b-bb0bf50f2cf0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:32.915 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 399f8dde-d495-4c04-893b-bb0bf50f2cf0 in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 unbound from our chassis#033[00m
Oct  2 08:23:32 np0005466013 nova_compute[192144]: 2025-10-02 12:23:32.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:32.918 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20eb29be-ee23-463b-85af-bfc2388e9f77#033[00m
Oct  2 08:23:32 np0005466013 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000074.scope: Deactivated successfully.
Oct  2 08:23:32 np0005466013 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000074.scope: Consumed 3.795s CPU time.
Oct  2 08:23:32 np0005466013 systemd-machined[152202]: Machine qemu-53-instance-00000074 terminated.
Oct  2 08:23:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:32.940 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[04851594-e705-4fb8-9942-e42431b82c3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:32.975 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9670a7f0-db87-43ff-bec9-0a4361e57847]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:32.978 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[85f1a8e8-6e7b-4083-b28e-2e168af186c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.017 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[69ac52ec-5b4c-4f0e-9a90-346e7c2da801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.053 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[48dc26b8-cf9e-4971-9009-cdf29d0b65cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561951, 'reachable_time': 40813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237711, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.077 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1b10d5c3-3e7e-4e10-9a9f-74be10a36cfa]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561963, 'tstamp': 561963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237723, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap20eb29be-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561966, 'tstamp': 561966}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237723, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.079 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.089 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20eb29be-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.089 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.089 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20eb29be-e0, col_values=(('external_ids', {'iface-id': 'e533861f-45cb-4843-b071-0b628ca25128'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:33.090 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.092 2 INFO nova.virt.libvirt.driver [-] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Instance destroyed successfully.#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.092 2 DEBUG nova.objects.instance [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 86e04f64-f88d-45c1-b90c-344bddb4c4b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.131 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.132 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.132 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.132 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.239 2 DEBUG nova.compute.manager [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.239 2 DEBUG nova.compute.manager [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing instance network info cache due to event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.239 2 DEBUG oslo_concurrency.lockutils [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.239 2 DEBUG oslo_concurrency.lockutils [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.239 2 DEBUG nova.network.neutron [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.710 2 INFO nova.virt.libvirt.driver [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Beginning cold snapshot process#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.958 2 DEBUG nova.privsep.utils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.959 2 DEBUG oslo_concurrency.processutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk /var/lib/nova/instances/snapshots/tmpm37y5ijf/ce5d77c82c024ff582032d5bcd5b9842 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:33 np0005466013 nova_compute[192144]: 2025-10-02 12:23:33.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:34 np0005466013 nova_compute[192144]: 2025-10-02 12:23:34.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:34 np0005466013 nova_compute[192144]: 2025-10-02 12:23:34.148 2 DEBUG oslo_concurrency.processutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9/disk /var/lib/nova/instances/snapshots/tmpm37y5ijf/ce5d77c82c024ff582032d5bcd5b9842" returned: 0 in 0.189s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:34 np0005466013 nova_compute[192144]: 2025-10-02 12:23:34.149 2 INFO nova.virt.libvirt.driver [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:23:34 np0005466013 nova_compute[192144]: 2025-10-02 12:23:34.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:34 np0005466013 nova_compute[192144]: 2025-10-02 12:23:34.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:23:34 np0005466013 nova_compute[192144]: 2025-10-02 12:23:34.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.240 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.240 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.240 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.241 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.408 2 DEBUG nova.compute.manager [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received event network-vif-unplugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.409 2 DEBUG oslo_concurrency.lockutils [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.409 2 DEBUG oslo_concurrency.lockutils [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.409 2 DEBUG oslo_concurrency.lockutils [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.409 2 DEBUG nova.compute.manager [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] No waiting events found dispatching network-vif-unplugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.410 2 WARNING nova.compute.manager [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received unexpected event network-vif-unplugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 for instance with vm_state paused and task_state shelving_image_uploading.#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.410 2 DEBUG nova.compute.manager [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received event network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.410 2 DEBUG oslo_concurrency.lockutils [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.410 2 DEBUG oslo_concurrency.lockutils [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.410 2 DEBUG oslo_concurrency.lockutils [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.410 2 DEBUG nova.compute.manager [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] No waiting events found dispatching network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.410 2 WARNING nova.compute.manager [req-dcf28bab-ad1d-4372-822c-4a8a7a942040 req-b9d3d0ab-8b14-4544-ac9c-678a5292f676 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received unexpected event network-vif-plugged-399f8dde-d495-4c04-893b-bb0bf50f2cf0 for instance with vm_state paused and task_state shelving_image_uploading.#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.518 2 DEBUG nova.network.neutron [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updated VIF entry in instance network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.519 2 DEBUG nova.network.neutron [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:35 np0005466013 nova_compute[192144]: 2025-10-02 12:23:35.542 2 DEBUG oslo_concurrency.lockutils [req-5d0ba866-6f66-4fd6-8652-e03bb9d3ffe3 req-b954f9d7-431d-4182-af63-a95d432f7990 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:35 np0005466013 podman[237735]: 2025-10-02 12:23:35.718965084 +0000 UTC m=+0.079285424 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, release=1755695350)
Oct  2 08:23:35 np0005466013 podman[237734]: 2025-10-02 12:23:35.720376749 +0000 UTC m=+0.085370456 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:23:35 np0005466013 podman[237736]: 2025-10-02 12:23:35.743304276 +0000 UTC m=+0.093831209 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.218 2 INFO nova.virt.libvirt.driver [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Snapshot image upload complete#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.218 2 DEBUG nova.compute.manager [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.318 2 INFO nova.compute.manager [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Shelve offloading#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.343 2 INFO nova.virt.libvirt.driver [-] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Instance destroyed successfully.#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.343 2 DEBUG nova.compute.manager [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.346 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.346 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquired lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.347 2 DEBUG nova.network.neutron [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.697 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updating instance_info_cache with network_info: [{"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.718 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:37 np0005466013 nova_compute[192144]: 2025-10-02 12:23:37.718 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:23:38 np0005466013 nova_compute[192144]: 2025-10-02 12:23:38.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:38 np0005466013 nova_compute[192144]: 2025-10-02 12:23:38.982 2 DEBUG nova.network.neutron [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Updating instance_info_cache with network_info: [{"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:39 np0005466013 nova_compute[192144]: 2025-10-02 12:23:39.029 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Releasing lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:39 np0005466013 nova_compute[192144]: 2025-10-02 12:23:39.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.323 2 INFO nova.virt.libvirt.driver [-] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Instance destroyed successfully.#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.324 2 DEBUG nova.objects.instance [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'resources' on Instance uuid 86e04f64-f88d-45c1-b90c-344bddb4c4b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.338 2 DEBUG nova.virt.libvirt.vif [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-292933404',display_name='tempest-ServerActionsTestOtherB-server-292933404',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-292933404',id=116,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:23:25Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-50xphvii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member',shelved_at='2025-10-02T12:23:37.218703',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='eeca2a34-a5d4-4f95-8235-3acd2cec652e'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:23:34Z,user_data=None,user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=86e04f64-f88d-45c1-b90c-344bddb4c4b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.339 2 DEBUG nova.network.os_vif_util [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap399f8dde-d4", "ovs_interfaceid": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.340 2 DEBUG nova.network.os_vif_util [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:da:03,bridge_name='br-int',has_traffic_filtering=True,id=399f8dde-d495-4c04-893b-bb0bf50f2cf0,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap399f8dde-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.340 2 DEBUG os_vif [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:da:03,bridge_name='br-int',has_traffic_filtering=True,id=399f8dde-d495-4c04-893b-bb0bf50f2cf0,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap399f8dde-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.343 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap399f8dde-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.351 2 INFO os_vif [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:da:03,bridge_name='br-int',has_traffic_filtering=True,id=399f8dde-d495-4c04-893b-bb0bf50f2cf0,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap399f8dde-d4')#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.352 2 INFO nova.virt.libvirt.driver [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Deleting instance files /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9_del#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.353 2 INFO nova.virt.libvirt.driver [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Deletion of /var/lib/nova/instances/86e04f64-f88d-45c1-b90c-344bddb4c4b9_del complete#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.495 2 INFO nova.scheduler.client.report [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Deleted allocations for instance 86e04f64-f88d-45c1-b90c-344bddb4c4b9#033[00m
Oct  2 08:23:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:40Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:53:5f 10.100.0.11
Oct  2 08:23:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:40Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:53:5f 10.100.0.11
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.591 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.591 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.649 2 DEBUG nova.compute.manager [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Received event network-changed-399f8dde-d495-4c04-893b-bb0bf50f2cf0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.649 2 DEBUG nova.compute.manager [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Refreshing instance network info cache due to event network-changed-399f8dde-d495-4c04-893b-bb0bf50f2cf0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.650 2 DEBUG oslo_concurrency.lockutils [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.650 2 DEBUG oslo_concurrency.lockutils [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.650 2 DEBUG nova.network.neutron [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Refreshing network info cache for port 399f8dde-d495-4c04-893b-bb0bf50f2cf0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.675 2 DEBUG nova.compute.provider_tree [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.698 2 DEBUG nova.scheduler.client.report [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.725 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:40 np0005466013 nova_compute[192144]: 2025-10-02 12:23:40.807 2 DEBUG oslo_concurrency.lockutils [None req-cd0c014a-8b18-42a2-b321-369ec04d7aef 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "86e04f64-f88d-45c1-b90c-344bddb4c4b9" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 8.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:41 np0005466013 podman[237803]: 2025-10-02 12:23:41.6984268 +0000 UTC m=+0.066169893 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:23:41 np0005466013 podman[237804]: 2025-10-02 12:23:41.715924658 +0000 UTC m=+0.079016006 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:23:44 np0005466013 nova_compute[192144]: 2025-10-02 12:23:44.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:44 np0005466013 nova_compute[192144]: 2025-10-02 12:23:44.358 2 DEBUG nova.network.neutron [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Updated VIF entry in instance network info cache for port 399f8dde-d495-4c04-893b-bb0bf50f2cf0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:23:44 np0005466013 nova_compute[192144]: 2025-10-02 12:23:44.358 2 DEBUG nova.network.neutron [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Updating instance_info_cache with network_info: [{"id": "399f8dde-d495-4c04-893b-bb0bf50f2cf0", "address": "fa:16:3e:50:da:03", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": null, "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap399f8dde-d4", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:44 np0005466013 nova_compute[192144]: 2025-10-02 12:23:44.381 2 DEBUG oslo_concurrency.lockutils [req-67a4fccb-0068-4680-9c4e-36914fbc7fbd req-5c6d011d-94e8-4530-a009-d912b8cb91de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-86e04f64-f88d-45c1-b90c-344bddb4c4b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:45 np0005466013 nova_compute[192144]: 2025-10-02 12:23:45.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:47 np0005466013 nova_compute[192144]: 2025-10-02 12:23:47.101 2 INFO nova.compute.manager [None req-879948f8-d4ab-4512-be7e-bbe15fef0724 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Get console output#033[00m
Oct  2 08:23:47 np0005466013 nova_compute[192144]: 2025-10-02 12:23:47.107 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:23:48 np0005466013 nova_compute[192144]: 2025-10-02 12:23:48.089 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407813.087639, 86e04f64-f88d-45c1-b90c-344bddb4c4b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:48 np0005466013 nova_compute[192144]: 2025-10-02 12:23:48.089 2 INFO nova.compute.manager [-] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:23:48 np0005466013 nova_compute[192144]: 2025-10-02 12:23:48.128 2 DEBUG nova.compute.manager [None req-78a609ea-6417-482f-b8f1-278575e9003f - - - - - -] [instance: 86e04f64-f88d-45c1-b90c-344bddb4c4b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:49 np0005466013 nova_compute[192144]: 2025-10-02 12:23:49.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:49 np0005466013 nova_compute[192144]: 2025-10-02 12:23:49.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:50 np0005466013 nova_compute[192144]: 2025-10-02 12:23:50.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:50 np0005466013 nova_compute[192144]: 2025-10-02 12:23:50.587 2 INFO nova.compute.manager [None req-1f10263b-4945-4eb1-a8db-5c9288196e97 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Get console output#033[00m
Oct  2 08:23:50 np0005466013 nova_compute[192144]: 2025-10-02 12:23:50.592 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:23:54 np0005466013 nova_compute[192144]: 2025-10-02 12:23:54.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:54 np0005466013 nova_compute[192144]: 2025-10-02 12:23:54.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:54 np0005466013 nova_compute[192144]: 2025-10-02 12:23:54.255 2 DEBUG oslo_concurrency.lockutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:54 np0005466013 nova_compute[192144]: 2025-10-02 12:23:54.256 2 DEBUG oslo_concurrency.lockutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:54 np0005466013 nova_compute[192144]: 2025-10-02 12:23:54.256 2 DEBUG nova.network.neutron [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:55 np0005466013 nova_compute[192144]: 2025-10-02 12:23:55.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:55 np0005466013 podman[237848]: 2025-10-02 12:23:55.696950945 +0000 UTC m=+0.061531819 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:23:55 np0005466013 podman[237847]: 2025-10-02 12:23:55.713984688 +0000 UTC m=+0.078218540 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:23:55 np0005466013 podman[237849]: 2025-10-02 12:23:55.745696721 +0000 UTC m=+0.108320464 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:23:56 np0005466013 nova_compute[192144]: 2025-10-02 12:23:56.382 2 DEBUG nova.network.neutron [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:56 np0005466013 nova_compute[192144]: 2025-10-02 12:23:56.417 2 DEBUG oslo_concurrency.lockutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:56 np0005466013 nova_compute[192144]: 2025-10-02 12:23:56.596 2 DEBUG nova.virt.libvirt.driver [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:23:56 np0005466013 nova_compute[192144]: 2025-10-02 12:23:56.596 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Creating file /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/43709aa18cff407fb8e2750778f9b992.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:23:56 np0005466013 nova_compute[192144]: 2025-10-02 12:23:56.597 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/43709aa18cff407fb8e2750778f9b992.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:57 np0005466013 nova_compute[192144]: 2025-10-02 12:23:57.133 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/43709aa18cff407fb8e2750778f9b992.tmp" returned: 1 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:57 np0005466013 nova_compute[192144]: 2025-10-02 12:23:57.134 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/43709aa18cff407fb8e2750778f9b992.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:23:57 np0005466013 nova_compute[192144]: 2025-10-02 12:23:57.134 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Creating directory /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:23:57 np0005466013 nova_compute[192144]: 2025-10-02 12:23:57.135 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:57 np0005466013 nova_compute[192144]: 2025-10-02 12:23:57.333 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:57 np0005466013 nova_compute[192144]: 2025-10-02 12:23:57.339 2 DEBUG nova.virt.libvirt.driver [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:23:59 np0005466013 nova_compute[192144]: 2025-10-02 12:23:59.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005466013 kernel: tap375c20c8-b3 (unregistering): left promiscuous mode
Oct  2 08:23:59 np0005466013 NetworkManager[51205]: <info>  [1759407839.7089] device (tap375c20c8-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:23:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:59Z|00464|binding|INFO|Releasing lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 from this chassis (sb_readonly=0)
Oct  2 08:23:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:59Z|00465|binding|INFO|Setting lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 down in Southbound
Oct  2 08:23:59 np0005466013 nova_compute[192144]: 2025-10-02 12:23:59.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005466013 ovn_controller[94366]: 2025-10-02T12:23:59Z|00466|binding|INFO|Removing iface tap375c20c8-b3 ovn-installed in OVS
Oct  2 08:23:59 np0005466013 nova_compute[192144]: 2025-10-02 12:23:59.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005466013 nova_compute[192144]: 2025-10-02 12:23:59.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:59.744 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:53:5f 10.100.0.11'], port_security=['fa:16:3e:af:53:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '32196dd3-2739-4c43-9532-b0365f8095af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91662be7-398f-4c34-a848-62b46821f0fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '56d0844b-17cf-4186-b565-d275a3fd7b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bb1944c-7514-4575-bf6c-55d1c733e488, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375c20c8-b3bc-484b-820a-f3988fb1bfa1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:59.746 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 in datapath 91662be7-398f-4c34-a848-62b46821f0fd unbound from our chassis#033[00m
Oct  2 08:23:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:59.749 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91662be7-398f-4c34-a848-62b46821f0fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:23:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:59.750 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf6a300-d3b3-4096-a173-53dfaa92744a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:23:59.751 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd namespace which is not needed anymore#033[00m
Oct  2 08:23:59 np0005466013 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000075.scope: Deactivated successfully.
Oct  2 08:23:59 np0005466013 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000075.scope: Consumed 14.599s CPU time.
Oct  2 08:23:59 np0005466013 systemd-machined[152202]: Machine qemu-54-instance-00000075 terminated.
Oct  2 08:23:59 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [NOTICE]   (237667) : haproxy version is 2.8.14-c23fe91
Oct  2 08:23:59 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [NOTICE]   (237667) : path to executable is /usr/sbin/haproxy
Oct  2 08:23:59 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [WARNING]  (237667) : Exiting Master process...
Oct  2 08:23:59 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [WARNING]  (237667) : Exiting Master process...
Oct  2 08:23:59 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [ALERT]    (237667) : Current worker (237669) exited with code 143 (Terminated)
Oct  2 08:23:59 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[237663]: [WARNING]  (237667) : All workers exited. Exiting... (0)
Oct  2 08:23:59 np0005466013 systemd[1]: libpod-612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674.scope: Deactivated successfully.
Oct  2 08:23:59 np0005466013 podman[237941]: 2025-10-02 12:23:59.938539239 +0000 UTC m=+0.069932120 container died 612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:23:59 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674-userdata-shm.mount: Deactivated successfully.
Oct  2 08:23:59 np0005466013 systemd[1]: var-lib-containers-storage-overlay-492a7a83638f26ef21a5e7e3135374166a3aed468a0298ec7b2cc87c85052369-merged.mount: Deactivated successfully.
Oct  2 08:23:59 np0005466013 podman[237941]: 2025-10-02 12:23:59.995194014 +0000 UTC m=+0.126586825 container cleanup 612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:00 np0005466013 systemd[1]: libpod-conmon-612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674.scope: Deactivated successfully.
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.099 2 DEBUG nova.compute.manager [req-ce3319fc-9bc2-44be-b639-ea2648cf8a7e req-eb566131-458f-460f-9068-fe1f1e8c6334 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.100 2 DEBUG oslo_concurrency.lockutils [req-ce3319fc-9bc2-44be-b639-ea2648cf8a7e req-eb566131-458f-460f-9068-fe1f1e8c6334 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.101 2 DEBUG oslo_concurrency.lockutils [req-ce3319fc-9bc2-44be-b639-ea2648cf8a7e req-eb566131-458f-460f-9068-fe1f1e8c6334 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.101 2 DEBUG oslo_concurrency.lockutils [req-ce3319fc-9bc2-44be-b639-ea2648cf8a7e req-eb566131-458f-460f-9068-fe1f1e8c6334 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.101 2 DEBUG nova.compute.manager [req-ce3319fc-9bc2-44be-b639-ea2648cf8a7e req-eb566131-458f-460f-9068-fe1f1e8c6334 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.102 2 WARNING nova.compute.manager [req-ce3319fc-9bc2-44be-b639-ea2648cf8a7e req-eb566131-458f-460f-9068-fe1f1e8c6334 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:24:00 np0005466013 podman[237984]: 2025-10-02 12:24:00.113339244 +0000 UTC m=+0.084174447 container remove 612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.119 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a2b2d9-a284-4d00-b3ee-95cdd5f682af]: (4, ('Thu Oct  2 12:23:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd (612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674)\n612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674\nThu Oct  2 12:24:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd (612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674)\n612a46b1e8e6575c0b4539753c72c09d4d931ef3d2eb855de57704da1f716674\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.122 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[866e6262-ba43-40d1-b31b-d2c246dccc6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.123 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91662be7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:00 np0005466013 kernel: tap91662be7-30: left promiscuous mode
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.145 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ba673a37-1baa-46fa-b525-2599ea4e455b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.169 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[80510867-d48b-40f9-aa8c-046f3986b77c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.171 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[09ab9322-e565-4413-bd49-fefa0e4b8d81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.189 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[42c979e1-706d-41fb-915f-8e12ea0b258c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579738, 'reachable_time': 35163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238005, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.194 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:00.195 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[4b49d5cf-d75c-4e62-bda0-e1d1cc921999]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:00 np0005466013 systemd[1]: run-netns-ovnmeta\x2d91662be7\x2d398f\x2d4c34\x2da848\x2d62b46821f0fd.mount: Deactivated successfully.
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.359 2 INFO nova.virt.libvirt.driver [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.370 2 INFO nova.virt.libvirt.driver [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance destroyed successfully.#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.373 2 DEBUG nova.virt.libvirt.vif [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2105436045',display_name='tempest-TestNetworkAdvancedServerOps-server-2105436045',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2105436045',id=117,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIPVZ/1ugRUXJi6kpxyVgRUtYTdMlYSz5NQQRRxSWUHE0SJ8tz8WjHhrHski+4uyv4G//M9upfdriwZTygaxranlXIWK6yJW4zVM7pqGP5AEtkUxwGNjsUk0aVRz2H8oSQ==',key_name='tempest-TestNetworkAdvancedServerOps-1888170662',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:23:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-0oq6jqhj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:23:53Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=32196dd3-2739-4c43-9532-b0365f8095af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--722078817", "vif_mac": "fa:16:3e:af:53:5f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.374 2 DEBUG nova.network.os_vif_util [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converting VIF {"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--722078817", "vif_mac": "fa:16:3e:af:53:5f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.376 2 DEBUG nova.network.os_vif_util [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.377 2 DEBUG os_vif [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap375c20c8-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.393 2 INFO os_vif [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3')#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.401 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.493 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.494 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.570 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.572 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Copying file /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk to 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:24:00 np0005466013 nova_compute[192144]: 2025-10-02 12:24:00.573 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.179 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "scp -r /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk" returned: 0 in 0.606s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.180 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Copying file /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.181 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk.config 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.445 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "scp -C -r /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk.config 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.446 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Copying file /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.447 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk.info 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.677 2 DEBUG oslo_concurrency.processutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "scp -C -r /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_resize/disk.info 192.168.122.100:/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info" returned: 0 in 0.230s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:01 np0005466013 nova_compute[192144]: 2025-10-02 12:24:01.957 2 DEBUG neutronclient.v2_0.client [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.150 2 DEBUG oslo_concurrency.lockutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.151 2 DEBUG oslo_concurrency.lockutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.151 2 DEBUG oslo_concurrency.lockutils [None req-bbab774b-2ea1-4a57-9049-cf3c8fae42ad cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:02.306 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:02.306 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:02.307 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.433 2 DEBUG nova.compute.manager [req-7d8692c8-6330-4987-b7b1-77c594806f5b req-3afd3b87-50ab-4e60-bbc2-0fd18288f443 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.434 2 DEBUG oslo_concurrency.lockutils [req-7d8692c8-6330-4987-b7b1-77c594806f5b req-3afd3b87-50ab-4e60-bbc2-0fd18288f443 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.434 2 DEBUG oslo_concurrency.lockutils [req-7d8692c8-6330-4987-b7b1-77c594806f5b req-3afd3b87-50ab-4e60-bbc2-0fd18288f443 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.434 2 DEBUG oslo_concurrency.lockutils [req-7d8692c8-6330-4987-b7b1-77c594806f5b req-3afd3b87-50ab-4e60-bbc2-0fd18288f443 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.435 2 DEBUG nova.compute.manager [req-7d8692c8-6330-4987-b7b1-77c594806f5b req-3afd3b87-50ab-4e60-bbc2-0fd18288f443 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.435 2 WARNING nova.compute.manager [req-7d8692c8-6330-4987-b7b1-77c594806f5b req-3afd3b87-50ab-4e60-bbc2-0fd18288f443 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:24:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:02.650 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:02.651 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:24:02 np0005466013 nova_compute[192144]: 2025-10-02 12:24:02.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:03.654 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:04 np0005466013 nova_compute[192144]: 2025-10-02 12:24:04.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:04 np0005466013 nova_compute[192144]: 2025-10-02 12:24:04.629 2 DEBUG nova.compute.manager [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:04 np0005466013 nova_compute[192144]: 2025-10-02 12:24:04.630 2 DEBUG nova.compute.manager [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing instance network info cache due to event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:24:04 np0005466013 nova_compute[192144]: 2025-10-02 12:24:04.630 2 DEBUG oslo_concurrency.lockutils [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:04 np0005466013 nova_compute[192144]: 2025-10-02 12:24:04.630 2 DEBUG oslo_concurrency.lockutils [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:04 np0005466013 nova_compute[192144]: 2025-10-02 12:24:04.631 2 DEBUG nova.network.neutron [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:24:05 np0005466013 nova_compute[192144]: 2025-10-02 12:24:05.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:06 np0005466013 podman[238020]: 2025-10-02 12:24:06.707895041 +0000 UTC m=+0.080126860 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:24:06 np0005466013 podman[238018]: 2025-10-02 12:24:06.715130668 +0000 UTC m=+0.087719948 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:24:06 np0005466013 podman[238019]: 2025-10-02 12:24:06.717218643 +0000 UTC m=+0.089975928 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Oct  2 08:24:07 np0005466013 nova_compute[192144]: 2025-10-02 12:24:07.802 2 DEBUG nova.network.neutron [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updated VIF entry in instance network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:07 np0005466013 nova_compute[192144]: 2025-10-02 12:24:07.802 2 DEBUG nova.network.neutron [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:07 np0005466013 nova_compute[192144]: 2025-10-02 12:24:07.827 2 DEBUG oslo_concurrency.lockutils [req-036b54a7-02ba-4321-a255-87515847fb61 req-e2bf5cd2-6cdd-4ae0-bcfe-f291e031b61e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:09 np0005466013 nova_compute[192144]: 2025-10-02 12:24:09.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:10 np0005466013 nova_compute[192144]: 2025-10-02 12:24:10.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:11 np0005466013 nova_compute[192144]: 2025-10-02 12:24:11.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:12 np0005466013 nova_compute[192144]: 2025-10-02 12:24:12.479 2 DEBUG nova.compute.manager [req-b4c79e12-c0cc-4681-ae08-729ad76acecd req-f7263593-1b2e-4c52-aff9-70522a4b55c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:12 np0005466013 nova_compute[192144]: 2025-10-02 12:24:12.480 2 DEBUG oslo_concurrency.lockutils [req-b4c79e12-c0cc-4681-ae08-729ad76acecd req-f7263593-1b2e-4c52-aff9-70522a4b55c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:12 np0005466013 nova_compute[192144]: 2025-10-02 12:24:12.480 2 DEBUG oslo_concurrency.lockutils [req-b4c79e12-c0cc-4681-ae08-729ad76acecd req-f7263593-1b2e-4c52-aff9-70522a4b55c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:12 np0005466013 nova_compute[192144]: 2025-10-02 12:24:12.480 2 DEBUG oslo_concurrency.lockutils [req-b4c79e12-c0cc-4681-ae08-729ad76acecd req-f7263593-1b2e-4c52-aff9-70522a4b55c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:12 np0005466013 nova_compute[192144]: 2025-10-02 12:24:12.480 2 DEBUG nova.compute.manager [req-b4c79e12-c0cc-4681-ae08-729ad76acecd req-f7263593-1b2e-4c52-aff9-70522a4b55c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:12 np0005466013 nova_compute[192144]: 2025-10-02 12:24:12.481 2 WARNING nova.compute.manager [req-b4c79e12-c0cc-4681-ae08-729ad76acecd req-f7263593-1b2e-4c52-aff9-70522a4b55c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:24:12 np0005466013 podman[238074]: 2025-10-02 12:24:12.706816887 +0000 UTC m=+0.072306826 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:24:12 np0005466013 podman[238073]: 2025-10-02 12:24:12.742108661 +0000 UTC m=+0.104541514 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.631 2 DEBUG nova.compute.manager [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.632 2 DEBUG oslo_concurrency.lockutils [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.633 2 DEBUG oslo_concurrency.lockutils [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.633 2 DEBUG oslo_concurrency.lockutils [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.634 2 DEBUG nova.compute.manager [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.634 2 WARNING nova.compute.manager [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.831 2 INFO nova.compute.manager [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Swapping old allocation on dict_keys(['8a5c5335-95d5-48d7-aa6f-2fc6c798dc80']) held by migration 544fb063-d35f-48cc-b24b-f2f0ae933652 for instance#033[00m
Oct  2 08:24:14 np0005466013 nova_compute[192144]: 2025-10-02 12:24:14.876 2 DEBUG nova.scheduler.client.report [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Overwriting current allocation {'allocations': {'55f2ae21-42ea-47d7-8c73-c3134981d708': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 63}}, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'consumer_generation': 1} on consumer 32196dd3-2739-4c43-9532-b0365f8095af move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.028 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407840.0272017, 32196dd3-2739-4c43-9532-b0365f8095af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.029 2 INFO nova.compute.manager [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.050 2 DEBUG nova.compute.manager [None req-aeaaedb3-effd-4ef2-bb31-526278f05e7d - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.054 2 DEBUG nova.compute.manager [None req-aeaaedb3-effd-4ef2-bb31-526278f05e7d - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.078 2 INFO nova.compute.manager [None req-aeaaedb3-effd-4ef2-bb31-526278f05e7d - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.241 2 INFO nova.network.neutron [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:15Z|00467|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:24:15 np0005466013 nova_compute[192144]: 2025-10-02 12:24:15.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.974 2 DEBUG nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.975 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.975 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.975 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.975 2 DEBUG nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.976 2 WARNING nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.976 2 DEBUG nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.976 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.976 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.976 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.977 2 DEBUG nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.977 2 WARNING nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.994 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.995 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:16 np0005466013 nova_compute[192144]: 2025-10-02 12:24:16.995 2 DEBUG nova.network.neutron [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:24:17 np0005466013 nova_compute[192144]: 2025-10-02 12:24:17.357 2 DEBUG nova.compute.manager [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:17 np0005466013 nova_compute[192144]: 2025-10-02 12:24:17.357 2 DEBUG nova.compute.manager [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing instance network info cache due to event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:24:17 np0005466013 nova_compute[192144]: 2025-10-02 12:24:17.357 2 DEBUG oslo_concurrency.lockutils [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.340 2 DEBUG nova.network.neutron [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.474 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.475 2 DEBUG nova.virt.libvirt.driver [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.479 2 DEBUG oslo_concurrency.lockutils [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.480 2 DEBUG nova.network.neutron [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.491 2 DEBUG nova.virt.libvirt.driver [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Start _get_guest_xml network_info=[{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.498 2 WARNING nova.virt.libvirt.driver [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.517 2 DEBUG nova.virt.libvirt.host [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.518 2 DEBUG nova.virt.libvirt.host [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.528 2 DEBUG nova.virt.libvirt.host [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.529 2 DEBUG nova.virt.libvirt.host [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.531 2 DEBUG nova.virt.libvirt.driver [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.531 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.532 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.532 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.533 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.533 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.534 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.534 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.534 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.535 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.535 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.536 2 DEBUG nova.virt.hardware [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.536 2 DEBUG nova.objects.instance [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 32196dd3-2739-4c43-9532-b0365f8095af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.565 2 DEBUG oslo_concurrency.processutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.661 2 DEBUG oslo_concurrency.processutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.662 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.663 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.664 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.666 2 DEBUG nova.virt.libvirt.vif [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2105436045',display_name='tempest-TestNetworkAdvancedServerOps-server-2105436045',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2105436045',id=117,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIPVZ/1ugRUXJi6kpxyVgRUtYTdMlYSz5NQQRRxSWUHE0SJ8tz8WjHhrHski+4uyv4G//M9upfdriwZTygaxranlXIWK6yJW4zVM7pqGP5AEtkUxwGNjsUk0aVRz2H8oSQ==',key_name='tempest-TestNetworkAdvancedServerOps-1888170662',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-0oq6jqhj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:14Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=32196dd3-2739-4c43-9532-b0365f8095af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.666 2 DEBUG nova.network.os_vif_util [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.668 2 DEBUG nova.network.os_vif_util [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.671 2 DEBUG nova.virt.libvirt.driver [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <uuid>32196dd3-2739-4c43-9532-b0365f8095af</uuid>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <name>instance-00000075</name>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2105436045</nova:name>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:24:18</nova:creationTime>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        <nova:port uuid="375c20c8-b3bc-484b-820a-f3988fb1bfa1">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <entry name="serial">32196dd3-2739-4c43-9532-b0365f8095af</entry>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <entry name="uuid">32196dd3-2739-4c43-9532-b0365f8095af</entry>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk.config"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:af:53:5f"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <target dev="tap375c20c8-b3"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/console.log" append="off"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <input type="keyboard" bus="usb"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:24:18 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:24:18 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:24:18 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:24:18 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.673 2 DEBUG nova.compute.manager [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Preparing to wait for external event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.674 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.674 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.674 2 DEBUG oslo_concurrency.lockutils [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.675 2 DEBUG nova.virt.libvirt.vif [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2105436045',display_name='tempest-TestNetworkAdvancedServerOps-server-2105436045',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2105436045',id=117,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIPVZ/1ugRUXJi6kpxyVgRUtYTdMlYSz5NQQRRxSWUHE0SJ8tz8WjHhrHski+4uyv4G//M9upfdriwZTygaxranlXIWK6yJW4zVM7pqGP5AEtkUxwGNjsUk0aVRz2H8oSQ==',key_name='tempest-TestNetworkAdvancedServerOps-1888170662',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-0oq6jqhj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:14Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=32196dd3-2739-4c43-9532-b0365f8095af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.676 2 DEBUG nova.network.os_vif_util [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.676 2 DEBUG nova.network.os_vif_util [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.677 2 DEBUG os_vif [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.678 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.678 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap375c20c8-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap375c20c8-b3, col_values=(('external_ids', {'iface-id': '375c20c8-b3bc-484b-820a-f3988fb1bfa1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:53:5f', 'vm-uuid': '32196dd3-2739-4c43-9532-b0365f8095af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:18 np0005466013 NetworkManager[51205]: <info>  [1759407858.7256] manager: (tap375c20c8-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.736 2 INFO os_vif [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3')#033[00m
Oct  2 08:24:18 np0005466013 kernel: tap375c20c8-b3: entered promiscuous mode
Oct  2 08:24:18 np0005466013 NetworkManager[51205]: <info>  [1759407858.8200] manager: (tap375c20c8-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Oct  2 08:24:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:18Z|00468|binding|INFO|Claiming lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 for this chassis.
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:18Z|00469|binding|INFO|375c20c8-b3bc-484b-820a-f3988fb1bfa1: Claiming fa:16:3e:af:53:5f 10.100.0.11
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.843 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:53:5f 10.100.0.11'], port_security=['fa:16:3e:af:53:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '32196dd3-2739-4c43-9532-b0365f8095af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91662be7-398f-4c34-a848-62b46821f0fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '10', 'neutron:security_group_ids': '56d0844b-17cf-4186-b565-d275a3fd7b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bb1944c-7514-4575-bf6c-55d1c733e488, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375c20c8-b3bc-484b-820a-f3988fb1bfa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:18Z|00470|binding|INFO|Setting lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 ovn-installed in OVS
Oct  2 08:24:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:18Z|00471|binding|INFO|Setting lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 up in Southbound
Oct  2 08:24:18 np0005466013 nova_compute[192144]: 2025-10-02 12:24:18.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.846 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 in datapath 91662be7-398f-4c34-a848-62b46821f0fd bound to our chassis#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.849 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 91662be7-398f-4c34-a848-62b46821f0fd#033[00m
Oct  2 08:24:18 np0005466013 systemd-udevd[238135]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.861 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[37daefd8-ed36-4ce2-9e0b-70b7fdef16b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.863 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap91662be7-31 in ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.865 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap91662be7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.866 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4e8b0ab0-9dc9-4368-a84e-e38c76b1ca9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 NetworkManager[51205]: <info>  [1759407858.8684] device (tap375c20c8-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:24:18 np0005466013 NetworkManager[51205]: <info>  [1759407858.8696] device (tap375c20c8-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.869 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[35d14a00-1fc4-4b38-b390-681c670c5a57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 systemd-machined[152202]: New machine qemu-55-instance-00000075.
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.886 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[210a2d56-d009-4c30-a4d0-f34f89c6463b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 systemd[1]: Started Virtual Machine qemu-55-instance-00000075.
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.903 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[877573ba-ad8c-48b0-9639-d4576238cae7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.942 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2c8116-873d-46e8-9d14-cf68b9b50c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 NetworkManager[51205]: <info>  [1759407858.9493] manager: (tap91662be7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/217)
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.949 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[43d09ca2-a7f6-464d-8214-af94e0e3b985]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.993 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e11b4290-d699-4982-9dea-3e4b09df067d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:18.997 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[83f92ee0-2061-419f-86f2-e47ab8d98ffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 NetworkManager[51205]: <info>  [1759407859.0367] device (tap91662be7-30): carrier: link connected
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.045 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a3dc7e0f-7f4b-40f6-9413-2183ce771028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.063 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9c57d058-1afd-46be-b308-7e2c24f02013]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91662be7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:4b:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585065, 'reachable_time': 22328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238170, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.086 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3dcefcff-4abe-4eec-8f33-95eceb8cf83c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:4b1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585065, 'tstamp': 585065}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238171, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.105 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[10dcf1e3-18b8-4abb-b396-fa980df86efc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91662be7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:4b:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585065, 'reachable_time': 22328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238172, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.152 2 DEBUG nova.compute.manager [req-3f8bcaeb-4ca2-4897-9a73-755363ad4ff6 req-cf8fc0e1-c9ec-4a57-9206-c6dc3f0d9f84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.152 2 DEBUG oslo_concurrency.lockutils [req-3f8bcaeb-4ca2-4897-9a73-755363ad4ff6 req-cf8fc0e1-c9ec-4a57-9206-c6dc3f0d9f84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.152 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b7981d30-2aef-4e32-959c-cf720ef2e960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.153 2 DEBUG oslo_concurrency.lockutils [req-3f8bcaeb-4ca2-4897-9a73-755363ad4ff6 req-cf8fc0e1-c9ec-4a57-9206-c6dc3f0d9f84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.154 2 DEBUG oslo_concurrency.lockutils [req-3f8bcaeb-4ca2-4897-9a73-755363ad4ff6 req-cf8fc0e1-c9ec-4a57-9206-c6dc3f0d9f84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.155 2 DEBUG nova.compute.manager [req-3f8bcaeb-4ca2-4897-9a73-755363ad4ff6 req-cf8fc0e1-c9ec-4a57-9206-c6dc3f0d9f84 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Processing event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.231 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[723b3c77-d1d7-45f2-8b11-9a311469816a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.233 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91662be7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.233 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.234 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91662be7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:19 np0005466013 NetworkManager[51205]: <info>  [1759407859.2365] manager: (tap91662be7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:19 np0005466013 kernel: tap91662be7-30: entered promiscuous mode
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.242 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap91662be7-30, col_values=(('external_ids', {'iface-id': '9bd4e8e6-11ff-43aa-92bf-67aec1a8e528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:19Z|00472|binding|INFO|Releasing lport 9bd4e8e6-11ff-43aa-92bf-67aec1a8e528 from this chassis (sb_readonly=0)
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.244 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/91662be7-398f-4c34-a848-62b46821f0fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/91662be7-398f-4c34-a848-62b46821f0fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.245 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[47d70239-5538-43f8-af40-667698157f82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.246 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-91662be7-398f-4c34-a848-62b46821f0fd
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/91662be7-398f-4c34-a848-62b46821f0fd.pid.haproxy
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 91662be7-398f-4c34-a848-62b46821f0fd
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:24:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:19.249 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'env', 'PROCESS_TAG=haproxy-91662be7-398f-4c34-a848-62b46821f0fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/91662be7-398f-4c34-a848-62b46821f0fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:19 np0005466013 podman[238212]: 2025-10-02 12:24:19.692967078 +0000 UTC m=+0.070381515 container create f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:24:19 np0005466013 systemd[1]: Started libpod-conmon-f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c.scope.
Oct  2 08:24:19 np0005466013 podman[238212]: 2025-10-02 12:24:19.661670848 +0000 UTC m=+0.039085305 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:24:19 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:24:19 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80f245d4ca0b821c6c391de78d46793a5e47b0691f566a87f6f7f3874cadf233/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:24:19 np0005466013 podman[238212]: 2025-10-02 12:24:19.79296839 +0000 UTC m=+0.170382857 container init f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:19 np0005466013 podman[238212]: 2025-10-02 12:24:19.799411002 +0000 UTC m=+0.176825439 container start f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:19 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[238228]: [NOTICE]   (238232) : New worker (238234) forked
Oct  2 08:24:19 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[238228]: [NOTICE]   (238232) : Loading success.
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.855 2 DEBUG nova.compute.manager [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.856 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407859.8562028, 32196dd3-2739-4c43-9532-b0365f8095af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.857 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Started (Lifecycle Event)#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.871 2 INFO nova.virt.libvirt.driver [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance running successfully.#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.871 2 DEBUG nova.virt.libvirt.driver [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.877 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.880 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.908 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.909 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407859.8570707, 32196dd3-2739-4c43-9532-b0365f8095af => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.909 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.943 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.948 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407859.8591616, 32196dd3-2739-4c43-9532-b0365f8095af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.948 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.967 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:19 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.972 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:20 np0005466013 nova_compute[192144]: 2025-10-02 12:24:19.999 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Oct  2 08:24:20 np0005466013 nova_compute[192144]: 2025-10-02 12:24:20.001 2 INFO nova.compute.manager [None req-409b0589-14df-43cc-a0b3-439ddfd4f3ac 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance to original state: 'active'#033[00m
Oct  2 08:24:20 np0005466013 nova_compute[192144]: 2025-10-02 12:24:20.085 2 DEBUG nova.network.neutron [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updated VIF entry in instance network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:20 np0005466013 nova_compute[192144]: 2025-10-02 12:24:20.085 2 DEBUG nova.network.neutron [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:20 np0005466013 nova_compute[192144]: 2025-10-02 12:24:20.109 2 DEBUG oslo_concurrency.lockutils [req-e44ee9f6-7754-4566-90e8-04b66674eee0 req-4a78725e-0658-41f9-be71-ec0e78339ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:20Z|00473|binding|INFO|Releasing lport 9bd4e8e6-11ff-43aa-92bf-67aec1a8e528 from this chassis (sb_readonly=0)
Oct  2 08:24:20 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:20Z|00474|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:24:20 np0005466013 nova_compute[192144]: 2025-10-02 12:24:20.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:21 np0005466013 nova_compute[192144]: 2025-10-02 12:24:21.286 2 DEBUG nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:21 np0005466013 nova_compute[192144]: 2025-10-02 12:24:21.286 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:21 np0005466013 nova_compute[192144]: 2025-10-02 12:24:21.287 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:21 np0005466013 nova_compute[192144]: 2025-10-02 12:24:21.287 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:21 np0005466013 nova_compute[192144]: 2025-10-02 12:24:21.287 2 DEBUG nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:21 np0005466013 nova_compute[192144]: 2025-10-02 12:24:21.287 2 WARNING nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.632 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.634 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.635 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.636 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.636 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.655 2 INFO nova.compute.manager [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Terminating instance#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.674 2 DEBUG nova.compute.manager [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:24:22 np0005466013 kernel: tap6374e02b-d2 (unregistering): left promiscuous mode
Oct  2 08:24:22 np0005466013 NetworkManager[51205]: <info>  [1759407862.7137] device (tap6374e02b-d2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:24:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:22Z|00475|binding|INFO|Releasing lport 6374e02b-d27f-466e-9a75-8ba586327036 from this chassis (sb_readonly=0)
Oct  2 08:24:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:22Z|00476|binding|INFO|Setting lport 6374e02b-d27f-466e-9a75-8ba586327036 down in Southbound
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:22Z|00477|binding|INFO|Removing iface tap6374e02b-d2 ovn-installed in OVS
Oct  2 08:24:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:22.735 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:03:e5 10.100.0.6'], port_security=['fa:16:3e:17:03:e5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f2f0b852-0c4a-4d16-9c7f-54845e7f7b42', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7993d405-22b6-4649-b5b8-9f3e7d07d4ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=6374e02b-d27f-466e-9a75-8ba586327036) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:22.736 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 6374e02b-d27f-466e-9a75-8ba586327036 in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 unbound from our chassis#033[00m
Oct  2 08:24:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:22.738 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20eb29be-ee23-463b-85af-bfc2388e9f77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:22.739 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f20366a0-4c13-4c67-988f-c4628a2debe6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:22.740 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 namespace which is not needed anymore#033[00m
Oct  2 08:24:22 np0005466013 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000071.scope: Deactivated successfully.
Oct  2 08:24:22 np0005466013 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000071.scope: Consumed 19.710s CPU time.
Oct  2 08:24:22 np0005466013 systemd-machined[152202]: Machine qemu-50-instance-00000071 terminated.
Oct  2 08:24:22 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [NOTICE]   (235521) : haproxy version is 2.8.14-c23fe91
Oct  2 08:24:22 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [NOTICE]   (235521) : path to executable is /usr/sbin/haproxy
Oct  2 08:24:22 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [WARNING]  (235521) : Exiting Master process...
Oct  2 08:24:22 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [WARNING]  (235521) : Exiting Master process...
Oct  2 08:24:22 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [ALERT]    (235521) : Current worker (235523) exited with code 143 (Terminated)
Oct  2 08:24:22 np0005466013 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[235517]: [WARNING]  (235521) : All workers exited. Exiting... (0)
Oct  2 08:24:22 np0005466013 systemd[1]: libpod-ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534.scope: Deactivated successfully.
Oct  2 08:24:22 np0005466013 podman[238269]: 2025-10-02 12:24:22.942356303 +0000 UTC m=+0.066000418 container died ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:24:22 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534-userdata-shm.mount: Deactivated successfully.
Oct  2 08:24:22 np0005466013 systemd[1]: var-lib-containers-storage-overlay-6bd8b09aaf0e9d525298c7ff89972a46bf226f7585941e7fa69b8b90dfdae1d8-merged.mount: Deactivated successfully.
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.980 2 INFO nova.virt.libvirt.driver [-] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Instance destroyed successfully.#033[00m
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.981 2 DEBUG nova.objects.instance [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'resources' on Instance uuid f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:22 np0005466013 podman[238269]: 2025-10-02 12:24:22.986520656 +0000 UTC m=+0.110164761 container cleanup ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:22 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.998 2 DEBUG nova.virt.libvirt.vif [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1061088034',display_name='tempest-ServerActionsTestOtherB-server-1061088034',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1061088034',id=113,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-f3s92q8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:21:48Z,user_data=None,user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=f2f0b852-0c4a-4d16-9c7f-54845e7f7b42,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:22.999 2 DEBUG nova.network.os_vif_util [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "6374e02b-d27f-466e-9a75-8ba586327036", "address": "fa:16:3e:17:03:e5", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6374e02b-d2", "ovs_interfaceid": "6374e02b-d27f-466e-9a75-8ba586327036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.001 2 DEBUG nova.network.os_vif_util [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:03:e5,bridge_name='br-int',has_traffic_filtering=True,id=6374e02b-d27f-466e-9a75-8ba586327036,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6374e02b-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.001 2 DEBUG os_vif [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:03:e5,bridge_name='br-int',has_traffic_filtering=True,id=6374e02b-d27f-466e-9a75-8ba586327036,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6374e02b-d2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.004 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6374e02b-d2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.009 2 INFO os_vif [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:03:e5,bridge_name='br-int',has_traffic_filtering=True,id=6374e02b-d27f-466e-9a75-8ba586327036,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6374e02b-d2')#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.010 2 INFO nova.virt.libvirt.driver [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Deleting instance files /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42_del#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.011 2 INFO nova.virt.libvirt.driver [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Deletion of /var/lib/nova/instances/f2f0b852-0c4a-4d16-9c7f-54845e7f7b42_del complete#033[00m
Oct  2 08:24:23 np0005466013 systemd[1]: libpod-conmon-ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534.scope: Deactivated successfully.
Oct  2 08:24:23 np0005466013 podman[238307]: 2025-10-02 12:24:23.077114913 +0000 UTC m=+0.059625788 container remove ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.085 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[48f1671d-901d-4678-af11-908a4a8f4651]: (4, ('Thu Oct  2 12:24:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 (ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534)\nec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534\nThu Oct  2 12:24:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 (ec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534)\nec5bfb782c3b7ae4bc1e30eba90123e0ba718892d3d4de75c69008d459956534\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.087 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[94a2351c-84b1-4ff6-bf3f-bbc5a4b8dcf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.089 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:23 np0005466013 kernel: tap20eb29be-e0: left promiscuous mode
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.105 2 INFO nova.compute.manager [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.105 2 DEBUG oslo.service.loopingcall [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.106 2 DEBUG nova.compute.manager [-] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.106 2 DEBUG nova.network.neutron [-] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.111 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4c7a8879-675d-47f2-9dba-efff18da187a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.134 2 DEBUG nova.compute.manager [req-b8a86b8e-c461-482c-a8b3-42f71a7656e7 req-dd8ba857-1876-4804-b677-49e117c797cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received event network-vif-unplugged-6374e02b-d27f-466e-9a75-8ba586327036 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.135 2 DEBUG oslo_concurrency.lockutils [req-b8a86b8e-c461-482c-a8b3-42f71a7656e7 req-dd8ba857-1876-4804-b677-49e117c797cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.135 2 DEBUG oslo_concurrency.lockutils [req-b8a86b8e-c461-482c-a8b3-42f71a7656e7 req-dd8ba857-1876-4804-b677-49e117c797cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.135 2 DEBUG oslo_concurrency.lockutils [req-b8a86b8e-c461-482c-a8b3-42f71a7656e7 req-dd8ba857-1876-4804-b677-49e117c797cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.136 2 DEBUG nova.compute.manager [req-b8a86b8e-c461-482c-a8b3-42f71a7656e7 req-dd8ba857-1876-4804-b677-49e117c797cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] No waiting events found dispatching network-vif-unplugged-6374e02b-d27f-466e-9a75-8ba586327036 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:23 np0005466013 nova_compute[192144]: 2025-10-02 12:24:23.136 2 DEBUG nova.compute.manager [req-b8a86b8e-c461-482c-a8b3-42f71a7656e7 req-dd8ba857-1876-4804-b677-49e117c797cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received event network-vif-unplugged-6374e02b-d27f-466e-9a75-8ba586327036 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.144 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ef1521dc-3178-47cf-ad92-f13d9d4fc0f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.146 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5fa3d2-7cb0-49f4-b290-fbd5b27dcaa3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.172 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc17174-05c4-436a-83a8-1a5321a81d6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561943, 'reachable_time': 24426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238324, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.175 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:23.176 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[10d069e4-ae53-4200-9908-d7ce62902f11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:23 np0005466013 systemd[1]: run-netns-ovnmeta\x2d20eb29be\x2dee23\x2d463b\x2d85af\x2dbfc2388e9f77.mount: Deactivated successfully.
Oct  2 08:24:24 np0005466013 nova_compute[192144]: 2025-10-02 12:24:24.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.245 2 DEBUG nova.compute.manager [req-df8052c8-a7b7-44e1-a4af-cc249f7bae56 req-6b43885e-e147-46b4-a254-9477d2be5020 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received event network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.248 2 DEBUG oslo_concurrency.lockutils [req-df8052c8-a7b7-44e1-a4af-cc249f7bae56 req-6b43885e-e147-46b4-a254-9477d2be5020 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.248 2 DEBUG oslo_concurrency.lockutils [req-df8052c8-a7b7-44e1-a4af-cc249f7bae56 req-6b43885e-e147-46b4-a254-9477d2be5020 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.248 2 DEBUG oslo_concurrency.lockutils [req-df8052c8-a7b7-44e1-a4af-cc249f7bae56 req-6b43885e-e147-46b4-a254-9477d2be5020 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.249 2 DEBUG nova.compute.manager [req-df8052c8-a7b7-44e1-a4af-cc249f7bae56 req-6b43885e-e147-46b4-a254-9477d2be5020 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] No waiting events found dispatching network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.249 2 WARNING nova.compute.manager [req-df8052c8-a7b7-44e1-a4af-cc249f7bae56 req-6b43885e-e147-46b4-a254-9477d2be5020 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received unexpected event network-vif-plugged-6374e02b-d27f-466e-9a75-8ba586327036 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.432 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:25 np0005466013 nova_compute[192144]: 2025-10-02 12:24:25.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.651 2 DEBUG nova.network.neutron [-] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.670 2 INFO nova.compute.manager [-] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Took 3.56 seconds to deallocate network for instance.#033[00m
Oct  2 08:24:26 np0005466013 podman[238325]: 2025-10-02 12:24:26.711002309 +0000 UTC m=+0.067530276 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:24:26 np0005466013 podman[238326]: 2025-10-02 12:24:26.724419258 +0000 UTC m=+0.077656793 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.737 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.738 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:26 np0005466013 podman[238327]: 2025-10-02 12:24:26.750579118 +0000 UTC m=+0.103645407 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.815 2 DEBUG nova.compute.provider_tree [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.829 2 DEBUG nova.scheduler.client.report [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.849 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.873 2 INFO nova.scheduler.client.report [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Deleted allocations for instance f2f0b852-0c4a-4d16-9c7f-54845e7f7b42#033[00m
Oct  2 08:24:26 np0005466013 nova_compute[192144]: 2025-10-02 12:24:26.963 2 DEBUG oslo_concurrency.lockutils [None req-1b652b50-76a1-4195-8598-12b720919394 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "f2f0b852-0c4a-4d16-9c7f-54845e7f7b42" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:27 np0005466013 nova_compute[192144]: 2025-10-02 12:24:27.013 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:27 np0005466013 nova_compute[192144]: 2025-10-02 12:24:27.014 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:24:27 np0005466013 nova_compute[192144]: 2025-10-02 12:24:27.358 2 DEBUG nova.compute.manager [req-c917b1b6-2fa3-4d11-ae59-6c78b5bfb0bf req-8ec82da4-fc2a-49d7-a42f-68c476065f56 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Received event network-vif-deleted-6374e02b-d27f-466e-9a75-8ba586327036 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:28 np0005466013 nova_compute[192144]: 2025-10-02 12:24:28.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:28 np0005466013 nova_compute[192144]: 2025-10-02 12:24:28.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.024 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.025 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.025 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.171 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.262 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.265 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.335 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.527 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.529 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5512MB free_disk=73.25724029541016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.529 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.530 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.647 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 32196dd3-2739-4c43-9532-b0365f8095af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.648 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.648 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.701 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:29 np0005466013 nova_compute[192144]: 2025-10-02 12:24:29.923 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:30 np0005466013 nova_compute[192144]: 2025-10-02 12:24:30.091 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:24:30 np0005466013 nova_compute[192144]: 2025-10-02 12:24:30.092 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:32Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:53:5f 10.100.0.11
Oct  2 08:24:33 np0005466013 nova_compute[192144]: 2025-10-02 12:24:33.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:33Z|00478|binding|INFO|Releasing lport 9bd4e8e6-11ff-43aa-92bf-67aec1a8e528 from this chassis (sb_readonly=0)
Oct  2 08:24:33 np0005466013 nova_compute[192144]: 2025-10-02 12:24:33.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:34 np0005466013 nova_compute[192144]: 2025-10-02 12:24:34.094 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:34 np0005466013 nova_compute[192144]: 2025-10-02 12:24:34.094 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:34 np0005466013 nova_compute[192144]: 2025-10-02 12:24:34.095 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:34 np0005466013 nova_compute[192144]: 2025-10-02 12:24:34.095 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:34 np0005466013 nova_compute[192144]: 2025-10-02 12:24:34.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:34 np0005466013 nova_compute[192144]: 2025-10-02 12:24:34.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:34 np0005466013 nova_compute[192144]: 2025-10-02 12:24:34.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:36 np0005466013 nova_compute[192144]: 2025-10-02 12:24:36.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:36 np0005466013 nova_compute[192144]: 2025-10-02 12:24:36.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:24:37 np0005466013 nova_compute[192144]: 2025-10-02 12:24:37.422 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:37 np0005466013 nova_compute[192144]: 2025-10-02 12:24:37.423 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:37 np0005466013 nova_compute[192144]: 2025-10-02 12:24:37.423 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:24:37 np0005466013 podman[238413]: 2025-10-02 12:24:37.693044902 +0000 UTC m=+0.063883601 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:24:37 np0005466013 podman[238414]: 2025-10-02 12:24:37.695818849 +0000 UTC m=+0.067392631 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64)
Oct  2 08:24:37 np0005466013 podman[238415]: 2025-10-02 12:24:37.695828429 +0000 UTC m=+0.064533451 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:24:37 np0005466013 nova_compute[192144]: 2025-10-02 12:24:37.971 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407862.9705503, f2f0b852-0c4a-4d16-9c7f-54845e7f7b42 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:37 np0005466013 nova_compute[192144]: 2025-10-02 12:24:37.972 2 INFO nova.compute.manager [-] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:24:38 np0005466013 nova_compute[192144]: 2025-10-02 12:24:38.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:38 np0005466013 nova_compute[192144]: 2025-10-02 12:24:38.173 2 INFO nova.compute.manager [None req-526288ba-fd4c-4c64-86ff-38f45f478c82 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Get console output#033[00m
Oct  2 08:24:38 np0005466013 nova_compute[192144]: 2025-10-02 12:24:38.179 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:24:38 np0005466013 nova_compute[192144]: 2025-10-02 12:24:38.189 2 DEBUG nova.compute.manager [None req-268e3de3-a674-4971-94c6-d10b9417e02c - - - - - -] [instance: f2f0b852-0c4a-4d16-9c7f-54845e7f7b42] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:39 np0005466013 nova_compute[192144]: 2025-10-02 12:24:39.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:39 np0005466013 nova_compute[192144]: 2025-10-02 12:24:39.718 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:39 np0005466013 nova_compute[192144]: 2025-10-02 12:24:39.777 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:39 np0005466013 nova_compute[192144]: 2025-10-02 12:24:39.778 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.559 2 DEBUG nova.compute.manager [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.559 2 DEBUG nova.compute.manager [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing instance network info cache due to event network-changed-375c20c8-b3bc-484b-820a-f3988fb1bfa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.560 2 DEBUG oslo_concurrency.lockutils [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.560 2 DEBUG oslo_concurrency.lockutils [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.561 2 DEBUG nova.network.neutron [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Refreshing network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.789 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.789 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.790 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.790 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.791 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.894 2 INFO nova.compute.manager [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Terminating instance#033[00m
Oct  2 08:24:40 np0005466013 nova_compute[192144]: 2025-10-02 12:24:40.967 2 DEBUG nova.compute.manager [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:24:40 np0005466013 kernel: tap375c20c8-b3 (unregistering): left promiscuous mode
Oct  2 08:24:40 np0005466013 NetworkManager[51205]: <info>  [1759407880.9944] device (tap375c20c8-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:24:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:41Z|00479|binding|INFO|Releasing lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 from this chassis (sb_readonly=0)
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:41Z|00480|binding|INFO|Setting lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 down in Southbound
Oct  2 08:24:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:41Z|00481|binding|INFO|Removing iface tap375c20c8-b3 ovn-installed in OVS
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Deactivated successfully.
Oct  2 08:24:41 np0005466013 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Consumed 14.028s CPU time.
Oct  2 08:24:41 np0005466013 systemd-machined[152202]: Machine qemu-55-instance-00000075 terminated.
Oct  2 08:24:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:41.122 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:53:5f 10.100.0.11'], port_security=['fa:16:3e:af:53:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '32196dd3-2739-4c43-9532-b0365f8095af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91662be7-398f-4c34-a848-62b46821f0fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '12', 'neutron:security_group_ids': '56d0844b-17cf-4186-b565-d275a3fd7b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bb1944c-7514-4575-bf6c-55d1c733e488, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375c20c8-b3bc-484b-820a-f3988fb1bfa1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:41.124 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 in datapath 91662be7-398f-4c34-a848-62b46821f0fd unbound from our chassis#033[00m
Oct  2 08:24:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:41.125 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91662be7-398f-4c34-a848-62b46821f0fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:41.126 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f9161ea5-e354-4c1a-a665-e77ac3b55907]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:41.127 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd namespace which is not needed anymore#033[00m
Oct  2 08:24:41 np0005466013 kernel: tap375c20c8-b3: entered promiscuous mode
Oct  2 08:24:41 np0005466013 kernel: tap375c20c8-b3 (unregistering): left promiscuous mode
Oct  2 08:24:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:41Z|00482|binding|INFO|Claiming lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 for this chassis.
Oct  2 08:24:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:41Z|00483|binding|INFO|375c20c8-b3bc-484b-820a-f3988fb1bfa1: Claiming fa:16:3e:af:53:5f 10.100.0.11
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:41.226 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:53:5f 10.100.0.11'], port_security=['fa:16:3e:af:53:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '32196dd3-2739-4c43-9532-b0365f8095af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91662be7-398f-4c34-a848-62b46821f0fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '12', 'neutron:security_group_ids': '56d0844b-17cf-4186-b565-d275a3fd7b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bb1944c-7514-4575-bf6c-55d1c733e488, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375c20c8-b3bc-484b-820a-f3988fb1bfa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:24:41Z|00484|binding|INFO|Releasing lport 375c20c8-b3bc-484b-820a-f3988fb1bfa1 from this chassis (sb_readonly=0)
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.259 2 INFO nova.virt.libvirt.driver [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Instance destroyed successfully.#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.260 2 DEBUG nova.objects.instance [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'resources' on Instance uuid 32196dd3-2739-4c43-9532-b0365f8095af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:41.361 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:53:5f 10.100.0.11'], port_security=['fa:16:3e:af:53:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '32196dd3-2739-4c43-9532-b0365f8095af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91662be7-398f-4c34-a848-62b46821f0fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '12', 'neutron:security_group_ids': '56d0844b-17cf-4186-b565-d275a3fd7b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bb1944c-7514-4575-bf6c-55d1c733e488, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375c20c8-b3bc-484b-820a-f3988fb1bfa1) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.369 2 DEBUG nova.virt.libvirt.vif [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2105436045',display_name='tempest-TestNetworkAdvancedServerOps-server-2105436045',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2105436045',id=117,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIPVZ/1ugRUXJi6kpxyVgRUtYTdMlYSz5NQQRRxSWUHE0SJ8tz8WjHhrHski+4uyv4G//M9upfdriwZTygaxranlXIWK6yJW4zVM7pqGP5AEtkUxwGNjsUk0aVRz2H8oSQ==',key_name='tempest-TestNetworkAdvancedServerOps-1888170662',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-0oq6jqhj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:20Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=32196dd3-2739-4c43-9532-b0365f8095af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.370 2 DEBUG nova.network.os_vif_util [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.371 2 DEBUG nova.network.os_vif_util [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.371 2 DEBUG os_vif [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap375c20c8-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[238228]: [NOTICE]   (238232) : haproxy version is 2.8.14-c23fe91
Oct  2 08:24:41 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[238228]: [NOTICE]   (238232) : path to executable is /usr/sbin/haproxy
Oct  2 08:24:41 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[238228]: [WARNING]  (238232) : Exiting Master process...
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.419 2 INFO os_vif [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:53:5f,bridge_name='br-int',has_traffic_filtering=True,id=375c20c8-b3bc-484b-820a-f3988fb1bfa1,network=Network(91662be7-398f-4c34-a848-62b46821f0fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375c20c8-b3')#033[00m
Oct  2 08:24:41 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[238228]: [ALERT]    (238232) : Current worker (238234) exited with code 143 (Terminated)
Oct  2 08:24:41 np0005466013 neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd[238228]: [WARNING]  (238232) : All workers exited. Exiting... (0)
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.420 2 INFO nova.virt.libvirt.driver [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Deleting instance files /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_del#033[00m
Oct  2 08:24:41 np0005466013 systemd[1]: libpod-f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c.scope: Deactivated successfully.
Oct  2 08:24:41 np0005466013 nova_compute[192144]: 2025-10-02 12:24:41.426 2 INFO nova.virt.libvirt.driver [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Deletion of /var/lib/nova/instances/32196dd3-2739-4c43-9532-b0365f8095af_del complete#033[00m
Oct  2 08:24:41 np0005466013 podman[238505]: 2025-10-02 12:24:41.428084666 +0000 UTC m=+0.182638480 container died f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:42 np0005466013 systemd[1]: var-lib-containers-storage-overlay-80f245d4ca0b821c6c391de78d46793a5e47b0691f566a87f6f7f3874cadf233-merged.mount: Deactivated successfully.
Oct  2 08:24:42 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:24:42 np0005466013 podman[238505]: 2025-10-02 12:24:42.306166853 +0000 UTC m=+1.060720617 container cleanup f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:24:42 np0005466013 systemd[1]: libpod-conmon-f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c.scope: Deactivated successfully.
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.564 2 INFO nova.compute.manager [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Took 1.60 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.565 2 DEBUG oslo.service.loopingcall [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.565 2 DEBUG nova.compute.manager [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.565 2 DEBUG nova.network.neutron [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:24:42 np0005466013 podman[238540]: 2025-10-02 12:24:42.66528883 +0000 UTC m=+0.335109745 container remove f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.670 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[25f6a929-f947-4d46-9c62-dd6ea4f67b37]: (4, ('Thu Oct  2 12:24:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd (f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c)\nf4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c\nThu Oct  2 12:24:42 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd (f4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c)\nf4240a8f43c0599e84e7b58e5ca05a890c7646b1a39fe27c1c859199f760458c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.672 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c65f8781-c00a-43a9-9ebd-362b09b455a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.673 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91662be7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:42 np0005466013 kernel: tap91662be7-30: left promiscuous mode
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.690 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[df37a138-7d66-4e19-ac03-0730f4dbcf20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.717 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[252ecdcf-8eb5-4549-8434-95b3d0aec522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.718 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d35062da-a334-4f26-909f-97baf465de4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.741 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9be22ab5-4aa8-4a56-8548-cac82cc1dc23]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585055, 'reachable_time': 26661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238554, 'error': None, 'target': 'ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.743 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-91662be7-398f-4c34-a848-62b46821f0fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.744 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[608a55eb-2605-4276-91ac-16588b1dc27e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.744 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 in datapath 91662be7-398f-4c34-a848-62b46821f0fd unbound from our chassis#033[00m
Oct  2 08:24:42 np0005466013 systemd[1]: run-netns-ovnmeta\x2d91662be7\x2d398f\x2d4c34\x2da848\x2d62b46821f0fd.mount: Deactivated successfully.
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.746 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91662be7-398f-4c34-a848-62b46821f0fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.747 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e293b1c6-d978-4b4f-a2d4-2fbc66962d61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.747 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375c20c8-b3bc-484b-820a-f3988fb1bfa1 in datapath 91662be7-398f-4c34-a848-62b46821f0fd unbound from our chassis#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.749 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91662be7-398f-4c34-a848-62b46821f0fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:42.749 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a80e2220-fac0-49cf-946e-e92f815f03d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.784 2 DEBUG nova.compute.manager [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.785 2 DEBUG oslo_concurrency.lockutils [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.785 2 DEBUG oslo_concurrency.lockutils [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.785 2 DEBUG oslo_concurrency.lockutils [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.785 2 DEBUG nova.compute.manager [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.786 2 DEBUG nova.compute.manager [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-unplugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.786 2 DEBUG nova.compute.manager [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.786 2 DEBUG oslo_concurrency.lockutils [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "32196dd3-2739-4c43-9532-b0365f8095af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.786 2 DEBUG oslo_concurrency.lockutils [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.786 2 DEBUG oslo_concurrency.lockutils [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.787 2 DEBUG nova.compute.manager [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] No waiting events found dispatching network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.787 2 WARNING nova.compute.manager [req-474ffe34-b560-4a09-962a-2703d03da1a2 req-5a040497-1bcf-4283-9894-58c735da6773 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received unexpected event network-vif-plugged-375c20c8-b3bc-484b-820a-f3988fb1bfa1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:42 np0005466013 podman[238556]: 2025-10-02 12:24:42.841659652 +0000 UTC m=+0.065653486 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:24:42 np0005466013 podman[238555]: 2025-10-02 12:24:42.85178177 +0000 UTC m=+0.075717493 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:24:42 np0005466013 nova_compute[192144]: 2025-10-02 12:24:42.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:43 np0005466013 nova_compute[192144]: 2025-10-02 12:24:43.129 2 DEBUG nova.network.neutron [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updated VIF entry in instance network info cache for port 375c20c8-b3bc-484b-820a-f3988fb1bfa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:43 np0005466013 nova_compute[192144]: 2025-10-02 12:24:43.130 2 DEBUG nova.network.neutron [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [{"id": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "address": "fa:16:3e:af:53:5f", "network": {"id": "91662be7-398f-4c34-a848-62b46821f0fd", "bridge": "br-int", "label": "tempest-network-smoke--722078817", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375c20c8-b3", "ovs_interfaceid": "375c20c8-b3bc-484b-820a-f3988fb1bfa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:43 np0005466013 nova_compute[192144]: 2025-10-02 12:24:43.182 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:43 np0005466013 nova_compute[192144]: 2025-10-02 12:24:43.183 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:24:43 np0005466013 nova_compute[192144]: 2025-10-02 12:24:43.256 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:24:43 np0005466013 nova_compute[192144]: 2025-10-02 12:24:43.338 2 DEBUG oslo_concurrency.lockutils [req-0df46a8f-a9fa-49b4-b540-5b403fbc2823 req-5219ca6c-e03a-43c3-82c6-2829faca7a2c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-32196dd3-2739-4c43-9532-b0365f8095af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:44 np0005466013 nova_compute[192144]: 2025-10-02 12:24:44.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:44 np0005466013 nova_compute[192144]: 2025-10-02 12:24:44.416 2 DEBUG nova.network.neutron [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:44 np0005466013 nova_compute[192144]: 2025-10-02 12:24:44.770 2 DEBUG nova.compute.manager [req-390d610e-f231-42b2-b073-72e0c4b16f6c req-95f5a132-0e1d-4438-b568-715791377c98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Received event network-vif-deleted-375c20c8-b3bc-484b-820a-f3988fb1bfa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:44 np0005466013 nova_compute[192144]: 2025-10-02 12:24:44.771 2 INFO nova.compute.manager [req-390d610e-f231-42b2-b073-72e0c4b16f6c req-95f5a132-0e1d-4438-b568-715791377c98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Neutron deleted interface 375c20c8-b3bc-484b-820a-f3988fb1bfa1; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:24:44 np0005466013 nova_compute[192144]: 2025-10-02 12:24:44.771 2 DEBUG nova.network.neutron [req-390d610e-f231-42b2-b073-72e0c4b16f6c req-95f5a132-0e1d-4438-b568-715791377c98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:44 np0005466013 nova_compute[192144]: 2025-10-02 12:24:44.807 2 INFO nova.compute.manager [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Took 2.24 seconds to deallocate network for instance.#033[00m
Oct  2 08:24:44 np0005466013 nova_compute[192144]: 2025-10-02 12:24:44.861 2 DEBUG nova.compute.manager [req-390d610e-f231-42b2-b073-72e0c4b16f6c req-95f5a132-0e1d-4438-b568-715791377c98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Detach interface failed, port_id=375c20c8-b3bc-484b-820a-f3988fb1bfa1, reason: Instance 32196dd3-2739-4c43-9532-b0365f8095af could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:24:45 np0005466013 nova_compute[192144]: 2025-10-02 12:24:45.388 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:45 np0005466013 nova_compute[192144]: 2025-10-02 12:24:45.389 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:45 np0005466013 nova_compute[192144]: 2025-10-02 12:24:45.450 2 DEBUG nova.compute.provider_tree [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:45 np0005466013 nova_compute[192144]: 2025-10-02 12:24:45.489 2 DEBUG nova.scheduler.client.report [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:45 np0005466013 nova_compute[192144]: 2025-10-02 12:24:45.671 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:45 np0005466013 nova_compute[192144]: 2025-10-02 12:24:45.730 2 INFO nova.scheduler.client.report [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Deleted allocations for instance 32196dd3-2739-4c43-9532-b0365f8095af#033[00m
Oct  2 08:24:46 np0005466013 nova_compute[192144]: 2025-10-02 12:24:46.208 2 DEBUG oslo_concurrency.lockutils [None req-65a42f1c-d67f-4310-b89a-8c78f8ef6146 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "32196dd3-2739-4c43-9532-b0365f8095af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:46 np0005466013 nova_compute[192144]: 2025-10-02 12:24:46.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:47 np0005466013 nova_compute[192144]: 2025-10-02 12:24:47.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:48 np0005466013 nova_compute[192144]: 2025-10-02 12:24:48.434 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:48 np0005466013 nova_compute[192144]: 2025-10-02 12:24:48.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:49 np0005466013 nova_compute[192144]: 2025-10-02 12:24:49.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:50 np0005466013 nova_compute[192144]: 2025-10-02 12:24:50.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:51 np0005466013 nova_compute[192144]: 2025-10-02 12:24:51.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:54 np0005466013 nova_compute[192144]: 2025-10-02 12:24:54.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:54 np0005466013 nova_compute[192144]: 2025-10-02 12:24:54.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:54 np0005466013 nova_compute[192144]: 2025-10-02 12:24:54.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:54.472 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:74:6f 10.100.0.2 2001:db8::f816:3eff:feea:746f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feea:746f/64', 'neutron:device_id': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2784fb0-50ac-4c91-ba90-3b5c38b8adf4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=adc60e93-14bb-4eb4-8a79-15dda196dc01) old=Port_Binding(mac=['fa:16:3e:ea:74:6f 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:54.473 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port adc60e93-14bb-4eb4-8a79-15dda196dc01 in datapath 26df2dcf-f57c-4dae-8522-0277df741ed3 updated#033[00m
Oct  2 08:24:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:54.475 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 26df2dcf-f57c-4dae-8522-0277df741ed3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:54 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:24:54.476 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[690679a4-18d8-4b0c-9248-3c8082e0fe93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:56 np0005466013 nova_compute[192144]: 2025-10-02 12:24:56.258 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407881.2573218, 32196dd3-2739-4c43-9532-b0365f8095af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:56 np0005466013 nova_compute[192144]: 2025-10-02 12:24:56.259 2 INFO nova.compute.manager [-] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:24:56 np0005466013 nova_compute[192144]: 2025-10-02 12:24:56.337 2 DEBUG nova.compute.manager [None req-7b9a6c16-d61f-4409-beba-a967ae401c88 - - - - - -] [instance: 32196dd3-2739-4c43-9532-b0365f8095af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:56 np0005466013 nova_compute[192144]: 2025-10-02 12:24:56.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:57 np0005466013 podman[238600]: 2025-10-02 12:24:57.682973636 +0000 UTC m=+0.055416157 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:24:57 np0005466013 podman[238601]: 2025-10-02 12:24:57.691150892 +0000 UTC m=+0.059893577 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:24:57 np0005466013 podman[238602]: 2025-10-02 12:24:57.738815004 +0000 UTC m=+0.098077462 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:24:59 np0005466013 nova_compute[192144]: 2025-10-02 12:24:59.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:01 np0005466013 nova_compute[192144]: 2025-10-02 12:25:01.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:02.307 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:02.308 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:02.308 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:02 np0005466013 nova_compute[192144]: 2025-10-02 12:25:02.563 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:02 np0005466013 nova_compute[192144]: 2025-10-02 12:25:02.564 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:02 np0005466013 nova_compute[192144]: 2025-10-02 12:25:02.758 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:25:03 np0005466013 nova_compute[192144]: 2025-10-02 12:25:03.987 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:03 np0005466013 nova_compute[192144]: 2025-10-02 12:25:03.988 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:03 np0005466013 nova_compute[192144]: 2025-10-02 12:25:03.997 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:25:03 np0005466013 nova_compute[192144]: 2025-10-02 12:25:03.998 2 INFO nova.compute.claims [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:25:04 np0005466013 nova_compute[192144]: 2025-10-02 12:25:04.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:04 np0005466013 nova_compute[192144]: 2025-10-02 12:25:04.727 2 DEBUG nova.compute.provider_tree [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:04 np0005466013 nova_compute[192144]: 2025-10-02 12:25:04.854 2 DEBUG nova.scheduler.client.report [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:05 np0005466013 nova_compute[192144]: 2025-10-02 12:25:05.025 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:05 np0005466013 nova_compute[192144]: 2025-10-02 12:25:05.026 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:25:05 np0005466013 nova_compute[192144]: 2025-10-02 12:25:05.635 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:25:05 np0005466013 nova_compute[192144]: 2025-10-02 12:25:05.635 2 DEBUG nova.network.neutron [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:25:05 np0005466013 nova_compute[192144]: 2025-10-02 12:25:05.802 2 DEBUG nova.policy [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:25:05 np0005466013 nova_compute[192144]: 2025-10-02 12:25:05.903 2 INFO nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:25:05 np0005466013 nova_compute[192144]: 2025-10-02 12:25:05.994 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.685 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.687 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.687 2 INFO nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Creating image(s)#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.688 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.688 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.689 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.710 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.775 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.777 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.778 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.794 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.856 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.857 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.899 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.901 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.902 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.967 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.968 2 DEBUG nova.virt.disk.api [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Checking if we can resize image /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:25:06 np0005466013 nova_compute[192144]: 2025-10-02 12:25:06.969 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.046 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.048 2 DEBUG nova.virt.disk.api [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Cannot resize image /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.048 2 DEBUG nova.objects.instance [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'migration_context' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.069 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.070 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Ensure instance console log exists: /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.070 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.071 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.071 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:07 np0005466013 nova_compute[192144]: 2025-10-02 12:25:07.128 2 DEBUG nova.network.neutron [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Successfully created port: 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:25:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:08.183 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:08.184 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.627 2 DEBUG nova.network.neutron [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Successfully updated port: 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.658 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.658 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.659 2 DEBUG nova.network.neutron [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:08 np0005466013 podman[238684]: 2025-10-02 12:25:08.703701037 +0000 UTC m=+0.072592855 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:25:08 np0005466013 podman[238686]: 2025-10-02 12:25:08.710802139 +0000 UTC m=+0.071976345 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:25:08 np0005466013 podman[238685]: 2025-10-02 12:25:08.736768452 +0000 UTC m=+0.091770324 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.759 2 DEBUG nova.compute.manager [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-changed-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.760 2 DEBUG nova.compute.manager [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Refreshing instance network info cache due to event network-changed-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.760 2 DEBUG oslo_concurrency.lockutils [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:08 np0005466013 nova_compute[192144]: 2025-10-02 12:25:08.874 2 DEBUG nova.network.neutron [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:25:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:09.187 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:09 np0005466013 nova_compute[192144]: 2025-10-02 12:25:09.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:09 np0005466013 nova_compute[192144]: 2025-10-02 12:25:09.949 2 DEBUG nova.network.neutron [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updating instance_info_cache with network_info: [{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.076 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.078 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance network_info: |[{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.079 2 DEBUG oslo_concurrency.lockutils [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.080 2 DEBUG nova.network.neutron [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Refreshing network info cache for port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.082 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Start _get_guest_xml network_info=[{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.088 2 WARNING nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.095 2 DEBUG nova.virt.libvirt.host [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.096 2 DEBUG nova.virt.libvirt.host [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.100 2 DEBUG nova.virt.libvirt.host [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.101 2 DEBUG nova.virt.libvirt.host [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.102 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.103 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.103 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.103 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.103 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.104 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.104 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.104 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.105 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.105 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.105 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.105 2 DEBUG nova.virt.hardware [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.111 2 DEBUG nova.virt.libvirt.vif [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:24:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-915765106',display_name='tempest-ServerStableDeviceRescueTest-server-915765106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-915765106',id=119,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-6ztfdnas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest
-ServerStableDeviceRescueTest-232864240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:06Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=f92877b9-dd8b-4444-a42b-987004802928,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.112 2 DEBUG nova.network.os_vif_util [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.113 2 DEBUG nova.network.os_vif_util [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.114 2 DEBUG nova.objects.instance [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'pci_devices' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.222 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <uuid>f92877b9-dd8b-4444-a42b-987004802928</uuid>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <name>instance-00000077</name>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-915765106</nova:name>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:25:10</nova:creationTime>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:user uuid="abb9f220716e48d79dbe2f97622937c4">tempest-ServerStableDeviceRescueTest-232864240-project-member</nova:user>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:project uuid="88e90c16adec46069b539d4f1431ab4d">tempest-ServerStableDeviceRescueTest-232864240</nova:project>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        <nova:port uuid="60b9a0ec-2ade-4f90-a7b0-443ac527ec3e">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <entry name="serial">f92877b9-dd8b-4444-a42b-987004802928</entry>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <entry name="uuid">f92877b9-dd8b-4444-a42b-987004802928</entry>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:f0:49:10"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <target dev="tap60b9a0ec-2a"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/console.log" append="off"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:25:10 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:25:10 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:25:10 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:25:10 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.224 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Preparing to wait for external event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.225 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.225 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.225 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.226 2 DEBUG nova.virt.libvirt.vif [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:24:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-915765106',display_name='tempest-ServerStableDeviceRescueTest-server-915765106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-915765106',id=119,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-6ztfdnas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_nam
e='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:06Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=f92877b9-dd8b-4444-a42b-987004802928,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.226 2 DEBUG nova.network.os_vif_util [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.227 2 DEBUG nova.network.os_vif_util [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.227 2 DEBUG os_vif [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.233 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60b9a0ec-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.234 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60b9a0ec-2a, col_values=(('external_ids', {'iface-id': '60b9a0ec-2ade-4f90-a7b0-443ac527ec3e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:49:10', 'vm-uuid': 'f92877b9-dd8b-4444-a42b-987004802928'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005466013 NetworkManager[51205]: <info>  [1759407910.2379] manager: (tap60b9a0ec-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.245 2 INFO os_vif [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a')#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.366 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.366 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.366 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No VIF found with MAC fa:16:3e:f0:49:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:25:10 np0005466013 nova_compute[192144]: 2025-10-02 12:25:10.367 2 INFO nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Using config drive#033[00m
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.480 2 INFO nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Creating config drive at /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config#033[00m
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.485 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gfpoqam execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.625 2 DEBUG oslo_concurrency.processutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gfpoqam" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:11 np0005466013 kernel: tap60b9a0ec-2a: entered promiscuous mode
Oct  2 08:25:11 np0005466013 NetworkManager[51205]: <info>  [1759407911.7184] manager: (tap60b9a0ec-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/220)
Oct  2 08:25:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:11Z|00485|binding|INFO|Claiming lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for this chassis.
Oct  2 08:25:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:11Z|00486|binding|INFO|60b9a0ec-2ade-4f90-a7b0-443ac527ec3e: Claiming fa:16:3e:f0:49:10 10.100.0.4
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.749 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:10 10.100.0.4'], port_security=['fa:16:3e:f0:49:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.750 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a bound to our chassis#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.752 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:25:11 np0005466013 systemd-udevd[238761]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:11 np0005466013 systemd-machined[152202]: New machine qemu-56-instance-00000077.
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.769 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ce31bc2f-d254-4270-a235-853f0d64bd80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.771 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaa4ebb90-e1 in ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.773 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaa4ebb90-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.773 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63a42067-9192-4a93-8aef-1403ec4d9213]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.774 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[705ab564-cf5d-4b56-b94e-bc4e21b442e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 NetworkManager[51205]: <info>  [1759407911.7775] device (tap60b9a0ec-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:11 np0005466013 NetworkManager[51205]: <info>  [1759407911.7789] device (tap60b9a0ec-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.791 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[a402abd7-1f92-4a39-ae5f-115027da828b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 systemd[1]: Started Virtual Machine qemu-56-instance-00000077.
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:11Z|00487|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e ovn-installed in OVS
Oct  2 08:25:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:11Z|00488|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e up in Southbound
Oct  2 08:25:11 np0005466013 nova_compute[192144]: 2025-10-02 12:25:11.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.811 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9bed863a-2aac-4a5c-b052-08a1695148ef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.851 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[dff5fd1e-13da-40e4-bfa8-8d051f3bbc86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.857 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7d59e1d4-7668-4f4d-a037-02563a6e3e8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 systemd-udevd[238765]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:11 np0005466013 NetworkManager[51205]: <info>  [1759407911.8606] manager: (tapaa4ebb90-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/221)
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.900 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c2bceead-695d-472e-a7f8-2482b1547907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.904 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9377e76a-2610-482b-91e3-90279dec3d64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 NetworkManager[51205]: <info>  [1759407911.9403] device (tapaa4ebb90-e0): carrier: link connected
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.947 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f25354-44c4-4b86-98df-01f091c51b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.969 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d215629e-a12f-4ff4-be86-564fe972a272]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590355, 'reachable_time': 19033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238795, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:11.989 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3fdfa9-9ef2-487d-83c5-4c237d52c836]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:898e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590355, 'tstamp': 590355}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238796, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.010 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[00dd27f3-90de-4105-bbdf-420a5890f449]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590355, 'reachable_time': 19033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238797, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.050 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[343e9b8c-825c-4710-a2e9-141b811cfaf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.136 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7c11ae18-76e1-4c42-b1c6-bf5ad5a6495a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.138 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.138 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.139 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466013 kernel: tapaa4ebb90-e0: entered promiscuous mode
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466013 NetworkManager[51205]: <info>  [1759407912.1459] manager: (tapaa4ebb90-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.146 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:12Z|00489|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.150 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.151 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4c2284-6056-47cb-9eab-b6fb384a281a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.152 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:25:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:12.153 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'env', 'PROCESS_TAG=haproxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aa4ebb90-ef5e-4974-a53d-2aabd696731a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.463 2 DEBUG nova.network.neutron [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updated VIF entry in instance network info cache for port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.466 2 DEBUG nova.network.neutron [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updating instance_info_cache with network_info: [{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.494 2 DEBUG oslo_concurrency.lockutils [req-683ee248-2879-4539-9d65-de70331d0f34 req-b1491448-f9a1-4af6-a2cb-004cdb2401ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.611 2 DEBUG nova.compute.manager [req-df11fdcb-309c-4a8c-904e-acb4b0d0e9da req-c6af6177-0a84-428a-b864-f319cebccf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.612 2 DEBUG oslo_concurrency.lockutils [req-df11fdcb-309c-4a8c-904e-acb4b0d0e9da req-c6af6177-0a84-428a-b864-f319cebccf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.613 2 DEBUG oslo_concurrency.lockutils [req-df11fdcb-309c-4a8c-904e-acb4b0d0e9da req-c6af6177-0a84-428a-b864-f319cebccf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.613 2 DEBUG oslo_concurrency.lockutils [req-df11fdcb-309c-4a8c-904e-acb4b0d0e9da req-c6af6177-0a84-428a-b864-f319cebccf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.613 2 DEBUG nova.compute.manager [req-df11fdcb-309c-4a8c-904e-acb4b0d0e9da req-c6af6177-0a84-428a-b864-f319cebccf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Processing event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:25:12 np0005466013 podman[238835]: 2025-10-02 12:25:12.624732024 +0000 UTC m=+0.074495985 container create 29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:12 np0005466013 systemd[1]: Started libpod-conmon-29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da.scope.
Oct  2 08:25:12 np0005466013 podman[238835]: 2025-10-02 12:25:12.577497134 +0000 UTC m=+0.027261085 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:25:12 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:25:12 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3cd4dc8ac81464f0994ea1e10c2002180207a14eec8a18dd63f29e14a36ce85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:25:12 np0005466013 podman[238835]: 2025-10-02 12:25:12.744941448 +0000 UTC m=+0.194705479 container init 29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:25:12 np0005466013 podman[238835]: 2025-10-02 12:25:12.753535817 +0000 UTC m=+0.203299788 container start 29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:25:12 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238850]: [NOTICE]   (238854) : New worker (238856) forked
Oct  2 08:25:12 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238850]: [NOTICE]   (238854) : Loading success.
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.977 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.978 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407912.9778988, f92877b9-dd8b-4444-a42b-987004802928 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.978 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Started (Lifecycle Event)#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.984 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.989 2 INFO nova.virt.libvirt.driver [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance spawned successfully.#033[00m
Oct  2 08:25:12 np0005466013 nova_compute[192144]: 2025-10-02 12:25:12.990 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.014 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.025 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.031 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.032 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.033 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.033 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.034 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.034 2 DEBUG nova.virt.libvirt.driver [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.048 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.050 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407912.978017, f92877b9-dd8b-4444-a42b-987004802928 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.050 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.087 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.092 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407912.9833505, f92877b9-dd8b-4444-a42b-987004802928 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.092 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.125 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.129 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.153 2 INFO nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Took 6.47 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.154 2 DEBUG nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.158 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.359 2 INFO nova.compute.manager [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Took 10.01 seconds to build instance.#033[00m
Oct  2 08:25:13 np0005466013 nova_compute[192144]: 2025-10-02 12:25:13.393 2 DEBUG oslo_concurrency.lockutils [None req-be63c404-4427-43b1-a1a1-c32fa68b8328 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:13 np0005466013 podman[238865]: 2025-10-02 12:25:13.724622916 +0000 UTC m=+0.078876261 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:25:13 np0005466013 podman[238866]: 2025-10-02 12:25:13.728617941 +0000 UTC m=+0.081891735 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:14 np0005466013 nova_compute[192144]: 2025-10-02 12:25:14.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:14 np0005466013 nova_compute[192144]: 2025-10-02 12:25:14.709 2 DEBUG nova.compute.manager [req-c450bef5-7263-4e15-ba1b-e95670d40961 req-9bfcccc3-4750-49bf-8cde-ab2a9f2751cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:14 np0005466013 nova_compute[192144]: 2025-10-02 12:25:14.709 2 DEBUG oslo_concurrency.lockutils [req-c450bef5-7263-4e15-ba1b-e95670d40961 req-9bfcccc3-4750-49bf-8cde-ab2a9f2751cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:14 np0005466013 nova_compute[192144]: 2025-10-02 12:25:14.710 2 DEBUG oslo_concurrency.lockutils [req-c450bef5-7263-4e15-ba1b-e95670d40961 req-9bfcccc3-4750-49bf-8cde-ab2a9f2751cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:14 np0005466013 nova_compute[192144]: 2025-10-02 12:25:14.710 2 DEBUG oslo_concurrency.lockutils [req-c450bef5-7263-4e15-ba1b-e95670d40961 req-9bfcccc3-4750-49bf-8cde-ab2a9f2751cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:14 np0005466013 nova_compute[192144]: 2025-10-02 12:25:14.710 2 DEBUG nova.compute.manager [req-c450bef5-7263-4e15-ba1b-e95670d40961 req-9bfcccc3-4750-49bf-8cde-ab2a9f2751cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:14 np0005466013 nova_compute[192144]: 2025-10-02 12:25:14.711 2 WARNING nova.compute.manager [req-c450bef5-7263-4e15-ba1b-e95670d40961 req-9bfcccc3-4750-49bf-8cde-ab2a9f2751cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:25:15 np0005466013 nova_compute[192144]: 2025-10-02 12:25:15.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:15 np0005466013 nova_compute[192144]: 2025-10-02 12:25:15.367 2 DEBUG nova.compute.manager [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:15 np0005466013 nova_compute[192144]: 2025-10-02 12:25:15.455 2 INFO nova.compute.manager [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] instance snapshotting#033[00m
Oct  2 08:25:15 np0005466013 nova_compute[192144]: 2025-10-02 12:25:15.896 2 INFO nova.virt.libvirt.driver [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Beginning live snapshot process#033[00m
Oct  2 08:25:16 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.128 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.221 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json -f qcow2" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.222 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.323 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json -f qcow2" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.336 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.353 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f92877b9-dd8b-4444-a42b-987004802928', 'name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000077', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '88e90c16adec46069b539d4f1431ab4d', 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'hostId': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.354 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.358 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f92877b9-dd8b-4444-a42b-987004802928 / tap60b9a0ec-2a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.359 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c1ffa12-fa24-489e-a4f4-5449437d728e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.355077', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da3396e4-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': '920fe52d848775a3bee37b4dfa22b05d2b5ed3571c652d9c7970afc671af4d2e'}]}, 'timestamp': '2025-10-02 12:25:16.360704', '_unique_id': '05275b3ceaf8474ab043e286140f91f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.362 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.364 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.364 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c4cfbb2-7b15-4865-89ab-7a06a43559f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.364481', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da344756-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': 'cdfd197019cf47259ecebd67e9bfca49b26bb8f3cafaa956373c3e4b60564e14'}]}, 'timestamp': '2025-10-02 12:25:16.365076', '_unique_id': '5240a4b0c503474aa6d51f61c7216cdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.366 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.367 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.388 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.389 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '731f7a62-530e-471c-a3ef-381e736390c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.368100', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da37eb0e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.04639817, 'message_signature': 'c40847c451017d975725ecb071f447e65ac9b1fbc2826d148db9442f41097cb4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.368100', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da3803d2-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.04639817, 'message_signature': '2f21c9f154f34a85e04efeb686bb83ef2cc8543a7249b391c2ed69420a4ea123'}]}, 'timestamp': '2025-10-02 12:25:16.389506', '_unique_id': '164cc07eab2e4dc9b4bb3561fcaccee6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.391 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.393 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.394 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87b395e3-4d54-4405-a572-1f0b6c0b6443', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.394016', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da38caa6-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': 'd461230f6a795dabfe39750389e6d3fa84595e5e74591655b190b7e602c77596'}]}, 'timestamp': '2025-10-02 12:25:16.394691', '_unique_id': 'be46372b789f43e58f0d98affeddd758'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.396 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.398 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.398 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.399 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>]
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.399 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.399 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.400 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>]
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.400 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.425 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.427 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6b235tsr/23ada1c55d1443baa646e0911b5bc76e.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.439 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.latency volume: 457927347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.440 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.latency volume: 947860 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8179484a-2b28-4d2b-b0c6-b9a1876496fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 457927347, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.400571', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da3fcfcc-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '75e65fca29e016c7868d69a29b31af80464a2aa42243813f0999c542a8fbe1c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 947860, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.400571', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da3fed86-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': 'f87006c80ee43eac4f26903fae10410faaa93b83f15016e5e46050094f903743'}]}, 'timestamp': '2025-10-02 12:25:16.441375', '_unique_id': 'f9f08d5941ad48f9a740eef13f3ac336'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.444 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.445 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6cb106f-0de6-498f-ba05-cff8f965b73f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.444908', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da408bce-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.04639817, 'message_signature': '99237a77b5fa4e5b71492869d9ae1efac08b0ac7efcec06adb69b7dde350e175'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.444908', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da409d80-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.04639817, 'message_signature': 'd015f42eb8bc5e70a7d4503164f78a1922e89b5e363d7fcf4b181432b00c0644'}]}, 'timestamp': '2025-10-02 12:25:16.445877', '_unique_id': '7d22860163404f6caeb5dfc7bf693a64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.446 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.448 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.448 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6afbfd66-7fe8-4fd2-a099-76969df03f08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.448310', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da411094-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': '7e055b70b44b077553be1760b3670cdcc2236c9e95ee106164e8d33c5a3adda5'}]}, 'timestamp': '2025-10-02 12:25:16.448818', '_unique_id': '18451e846bb041b7a2a2a1b7818d35ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.451 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1217e1b4-30aa-426d-9bdc-2dd35cc6b2de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.451223', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da418204-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': 'dcea9dd0d121f2872a5c5fc10e94a2fe71f4a2250d6a14b1342f2beee7a257ec'}]}, 'timestamp': '2025-10-02 12:25:16.451715', '_unique_id': 'ba12e94a97ab415db46fb057ac88b099'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.452 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.454 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.454 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78a24a91-5aca-4cca-9583-89d4c250c366', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.454096', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da41f1d0-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '00229443fb20b726f819739e29683a90b006b7c5fe743e63153f8d95c6aa7b66'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.454096', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da42030a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': 'c38ff430bff5699e04f365fdae275a0bbc71d1373092234c5257c2c3a73e6061'}]}, 'timestamp': '2025-10-02 12:25:16.455044', '_unique_id': '5429127e72e04c51bb67b8abc58d01ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.457 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3d8213f-10a7-41fa-aaf9-f621437491f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.457419', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da4273ee-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': '0bd9aae88beaa1190c13875555599acc05b28ddcab0a20dd3a398051a9951898'}]}, 'timestamp': '2025-10-02 12:25:16.457947', '_unique_id': '559865f77e244151b94d01fc8e8fee2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.460 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e93b5ac-e1a5-43a6-a3c5-129e8d315a2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.460747', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da42fc1a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': 'bf995721eb9bd13ea50c7062d7429121311016ab549375eba85404c9f83aad1f'}]}, 'timestamp': '2025-10-02 12:25:16.461411', '_unique_id': 'ecce20ad3c704ad68144f5f181b05ea8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.463 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.498 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6b235tsr/23ada1c55d1443baa646e0911b5bc76e.delta 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.500 2 INFO nova.virt.libvirt.driver [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.508 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.508 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f92877b9-dd8b-4444-a42b-987004802928: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.508 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14574e5a-d820-4816-918b-67b81862760f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.508952', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da4a546a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': 'ff7b5dd3e5850f28e07f712e1fe2f220e75a5724626ccd1d9528f90d61e155e4'}]}, 'timestamp': '2025-10-02 12:25:16.509659', '_unique_id': '075a8826459c4f7aaf9852e2ed8166b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.511 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.513 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.513 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd51bd01-8412-4eb0-a037-90b3314dcf97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.513011', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da4aef88-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.04639817, 'message_signature': '78df1e5e4a848ba0c20a0da3c9f2264cf0f81e0cfd0dd41928c3d6b9fe3b0eec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.513011', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da4b0482-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.04639817, 'message_signature': '1c4e607d6e1a630ea66d119e0f1035a7f4cccc90456efa2d61a6d23ee1a04f89'}]}, 'timestamp': '2025-10-02 12:25:16.521270', '_unique_id': '616a1ce77b6e406f880268859e8681e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.522 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.524 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.525 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f8fd8a5-df76-480c-9fb5-81a73422e668', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.524450', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da4cb214-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '8560a98fb7ed8c9598642fc042fca4eb9bca287421649118466ee1b2c491ea2f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.524450', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da4cd334-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '760065c420e91c14532cf5e830a58f75648cc928213cf241a437f61daadccdc3'}]}, 'timestamp': '2025-10-02 12:25:16.525907', '_unique_id': 'd959cee457064d3ba6b43eceadbad269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.528 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.530 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.530 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/cpu volume: 3260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c840f11-5708-44d1-9e8c-efc4053a3081', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3260000000, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'timestamp': '2025-10-02T12:25:16.530512', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da4d9c9c-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.186385804, 'message_signature': '49eca189a36d31df8254873efe77479d1aacfe7e495a016e668c7c6fb387baf9'}]}, 'timestamp': '2025-10-02 12:25:16.531144', '_unique_id': '8a3802515cd04a49a155c33654d6cb54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.535 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.536 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55d02fb6-0d37-43db-a5c9-a66d9476a455', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.536661', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da4e8e7c-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': 'b05b2674cc89aee632c1ce6f29f793f2bad0222aebefcb9189623ed5ded1651f'}]}, 'timestamp': '2025-10-02 12:25:16.537252', '_unique_id': '8e3e191e6ba14d2eb29e796ec4ba8aee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.538 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.539 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.540 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23fe6a7b-c3b5-4fa3-8622-63d3525b6a40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.539757', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da4f047e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': 'a8fa478febf08029ecc3e289ed7ad82e4755b38c07a57a7d05c6beb10e5a45b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 
'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.539757', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da4f141e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '6d6c3965587ecb119dfd5fbf4c270243ab1a0e6051f792d09b9dd9fae5cc8d8a'}]}, 'timestamp': '2025-10-02 12:25:16.540609', '_unique_id': 'c78875f94ee6446ba6f0f32d3b81ea78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.541 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.542 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca7aa665-8569-4fec-a752-d5e74726e180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:25:16.542379', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': 'da4f6932-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.033330791, 'message_signature': '8a7cc76a66c53b8fe80431db211dbaeabab3a1129d42b99e894cb0115ac8f226'}]}, 'timestamp': '2025-10-02 12:25:16.542804', '_unique_id': '6e331bf5d04a46cea39f25e4388aa114'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.543 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.544 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.544 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2504c55e-988b-4bdb-8dc3-ee19e38bdcea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.544391', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da4fb766-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': 'c1541aa20bfdb8d83c8d2bdcd470d8a1a5a3bcb47d5985b1cbeb009edd89a324'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 
'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.544391', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da4fc76a-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '29ade54cc21cfeb868a909ab94ed3fc0df7f6e0af217ba69cc7d1352a8f581b1'}]}, 'timestamp': '2025-10-02 12:25:16.545153', '_unique_id': 'ed91e34250c54967b4e3eb5b8fa39c1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.545 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.546 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.546 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.546 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>]
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.547 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.547 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.547 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-915765106>]
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.547 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.547 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.548 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc765195-ada6-41dd-bff6-013413423d96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:25:16.547908', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da50415e-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '912504c2942d691921ba3459414f4750a34127e9b1f71420c122f40905457ae1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 
'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:25:16.547908', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da505022-9f8a-11f0-9b9a-fa163ec2af05', 'monotonic_time': 5908.078882867, 'message_signature': '3bea3ad7e09118f2049e0c16d30cf9449c4ac4d2ac46f20ac18c6a527dc396c9'}]}, 'timestamp': '2025-10-02 12:25:16.548699', '_unique_id': 'f100ee67d6b6416780f88d76e6e31467'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:25:16.549 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.580 2 DEBUG nova.virt.libvirt.guest [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.584 2 INFO nova.virt.libvirt.driver [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Skipping quiescing instance: QEMU guest agent is not enabled.
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.640 2 DEBUG nova.privsep.utils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.641 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6b235tsr/23ada1c55d1443baa646e0911b5bc76e.delta /var/lib/nova/instances/snapshots/tmp6b235tsr/23ada1c55d1443baa646e0911b5bc76e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:25:16 np0005466013 nova_compute[192144]: 2025-10-02 12:25:16.998 2 DEBUG oslo_concurrency.processutils [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6b235tsr/23ada1c55d1443baa646e0911b5bc76e.delta /var/lib/nova/instances/snapshots/tmp6b235tsr/23ada1c55d1443baa646e0911b5bc76e" returned: 0 in 0.357s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:25:17 np0005466013 nova_compute[192144]: 2025-10-02 12:25:17.000 2 INFO nova.virt.libvirt.driver [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Snapshot extracted, beginning image upload
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.202 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.205 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.227 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.346 2 INFO nova.virt.libvirt.driver [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Snapshot image upload complete
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.347 2 INFO nova.compute.manager [None req-38565315-5481-4e67-a1c0-93c50a438391 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Took 3.87 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.387 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.390 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.400 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.400 2 INFO nova.compute.claims [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.586 2 DEBUG nova.compute.provider_tree [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.609 2 DEBUG nova.scheduler.client.report [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.644 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.645 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.756 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.757 2 DEBUG nova.network.neutron [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.801 2 INFO nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.837 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.994 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.996 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.997 2 INFO nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Creating image(s)#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.998 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:19 np0005466013 nova_compute[192144]: 2025-10-02 12:25:19.999 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.000 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.016 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.051 2 DEBUG nova.policy [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.121 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.122 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.123 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.134 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.213 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.215 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.307 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk 1073741824" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.308 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.309 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.375 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.378 2 DEBUG nova.virt.disk.api [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Checking if we can resize image /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.379 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.492 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.493 2 DEBUG nova.virt.disk.api [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Cannot resize image /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.494 2 DEBUG nova.objects.instance [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 595aea98-0c3e-45c9-81fe-4643f44fe8d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.520 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.521 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Ensure instance console log exists: /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.522 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.523 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:20 np0005466013 nova_compute[192144]: 2025-10-02 12:25:20.524 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:21 np0005466013 nova_compute[192144]: 2025-10-02 12:25:21.155 2 DEBUG nova.network.neutron [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Successfully created port: 375468b9-b213-41ae-87ca-ea569359bdb6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:25:21 np0005466013 nova_compute[192144]: 2025-10-02 12:25:21.568 2 INFO nova.compute.manager [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Rescuing#033[00m
Oct  2 08:25:21 np0005466013 nova_compute[192144]: 2025-10-02 12:25:21.570 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:21 np0005466013 nova_compute[192144]: 2025-10-02 12:25:21.571 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:21 np0005466013 nova_compute[192144]: 2025-10-02 12:25:21.571 2 DEBUG nova.network.neutron [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.594 2 DEBUG nova.network.neutron [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Successfully updated port: 375468b9-b213-41ae-87ca-ea569359bdb6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.620 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.620 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.620 2 DEBUG nova.network.neutron [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.693 2 DEBUG nova.compute.manager [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.694 2 DEBUG nova.compute.manager [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing instance network info cache due to event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.694 2 DEBUG oslo_concurrency.lockutils [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:22 np0005466013 nova_compute[192144]: 2025-10-02 12:25:22.774 2 DEBUG nova.network.neutron [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.454 2 DEBUG nova.network.neutron [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updating instance_info_cache with network_info: [{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.485 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.677 2 DEBUG nova.network.neutron [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.703 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.704 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Instance network_info: |[{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.705 2 DEBUG oslo_concurrency.lockutils [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.705 2 DEBUG nova.network.neutron [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.708 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Start _get_guest_xml network_info=[{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.716 2 WARNING nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.722 2 DEBUG nova.virt.libvirt.host [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.723 2 DEBUG nova.virt.libvirt.host [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.727 2 DEBUG nova.virt.libvirt.host [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.727 2 DEBUG nova.virt.libvirt.host [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.728 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.729 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.729 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.729 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.730 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.730 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.730 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.730 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.731 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.731 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.731 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.732 2 DEBUG nova.virt.hardware [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.735 2 DEBUG nova.virt.libvirt.vif [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:25:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1910014548',display_name='tempest-TestNetworkAdvancedServerOps-server-1910014548',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1910014548',id=122,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGcjNWQLhorxyiCglISTL85sF/PebgHfK15yneJUghAfWhPSxNP3NydyYmhFfkO9o84fX5BcllBeB8dR7YwYFd3thDd5cmALiWGCn51055R0ZMgFMvAQxqZx7i5T53aIfQ==',key_name='tempest-TestNetworkAdvancedServerOps-954911067',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-axe4xfwn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:19Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=595aea98-0c3e-45c9-81fe-4643f44fe8d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.735 2 DEBUG nova.network.os_vif_util [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.736 2 DEBUG nova.network.os_vif_util [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.737 2 DEBUG nova.objects.instance [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 595aea98-0c3e-45c9-81fe-4643f44fe8d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.749 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <uuid>595aea98-0c3e-45c9-81fe-4643f44fe8d3</uuid>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <name>instance-0000007a</name>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1910014548</nova:name>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:25:23</nova:creationTime>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        <nova:port uuid="375468b9-b213-41ae-87ca-ea569359bdb6">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:25:23 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <entry name="serial">595aea98-0c3e-45c9-81fe-4643f44fe8d3</entry>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <entry name="uuid">595aea98-0c3e-45c9-81fe-4643f44fe8d3</entry>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:06:5f:dd"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <target dev="tap375468b9-b2"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/console.log" append="off"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:25:23 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:25:23 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:25:23 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:25:23 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.760 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Preparing to wait for external event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.760 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.760 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.761 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.761 2 DEBUG nova.virt.libvirt.vif [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:25:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1910014548',display_name='tempest-TestNetworkAdvancedServerOps-server-1910014548',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1910014548',id=122,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGcjNWQLhorxyiCglISTL85sF/PebgHfK15yneJUghAfWhPSxNP3NydyYmhFfkO9o84fX5BcllBeB8dR7YwYFd3thDd5cmALiWGCn51055R0ZMgFMvAQxqZx7i5T53aIfQ==',key_name='tempest-TestNetworkAdvancedServerOps-954911067',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-axe4xfwn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:19Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=595aea98-0c3e-45c9-81fe-4643f44fe8d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.762 2 DEBUG nova.network.os_vif_util [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.762 2 DEBUG nova.network.os_vif_util [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.763 2 DEBUG os_vif [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.764 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.764 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap375468b9-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.772 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap375468b9-b2, col_values=(('external_ids', {'iface-id': '375468b9-b213-41ae-87ca-ea569359bdb6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:5f:dd', 'vm-uuid': '595aea98-0c3e-45c9-81fe-4643f44fe8d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:23 np0005466013 NetworkManager[51205]: <info>  [1759407923.7766] manager: (tap375468b9-b2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.794 2 INFO os_vif [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2')#033[00m
Oct  2 08:25:23 np0005466013 nova_compute[192144]: 2025-10-02 12:25:23.919 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.047 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.047 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.048 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No VIF found with MAC fa:16:3e:06:5f:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.048 2 INFO nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Using config drive#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.381 2 INFO nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Creating config drive at /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.391 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ttzsq_t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.538 2 DEBUG oslo_concurrency.processutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ttzsq_t" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:24 np0005466013 kernel: tap375468b9-b2: entered promiscuous mode
Oct  2 08:25:24 np0005466013 NetworkManager[51205]: <info>  [1759407924.6198] manager: (tap375468b9-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Oct  2 08:25:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:24Z|00490|binding|INFO|Claiming lport 375468b9-b213-41ae-87ca-ea569359bdb6 for this chassis.
Oct  2 08:25:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:24Z|00491|binding|INFO|375468b9-b213-41ae-87ca-ea569359bdb6: Claiming fa:16:3e:06:5f:dd 10.100.0.6
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.642 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:5f:dd 10.100.0.6'], port_security=['fa:16:3e:06:5f:dd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '595aea98-0c3e-45c9-81fe-4643f44fe8d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7af61307-f367-4334-ad00-5d542cb00bd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'db364350-0b47-4c18-8ab1-bf862406804b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e947ec6-847e-4b20-b912-5e8f3559dfc4, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375468b9-b213-41ae-87ca-ea569359bdb6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.644 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375468b9-b213-41ae-87ca-ea569359bdb6 in datapath 7af61307-f367-4334-ad00-5d542cb00bd9 bound to our chassis#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.647 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7af61307-f367-4334-ad00-5d542cb00bd9#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.661 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b877bae4-e90f-4ec6-9088-317a419b9afc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.662 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7af61307-f1 in ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.665 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7af61307-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.665 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd64659-737e-47fb-ae04-170493b93b53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.666 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cf7d1085-1d03-42f3-942a-35f77a51b88b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 systemd-udevd[238980]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:24 np0005466013 NetworkManager[51205]: <info>  [1759407924.6843] device (tap375468b9-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:24Z|00492|binding|INFO|Setting lport 375468b9-b213-41ae-87ca-ea569359bdb6 ovn-installed in OVS
Oct  2 08:25:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:24Z|00493|binding|INFO|Setting lport 375468b9-b213-41ae-87ca-ea569359bdb6 up in Southbound
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:24 np0005466013 NetworkManager[51205]: <info>  [1759407924.6909] device (tap375468b9-b2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:24 np0005466013 systemd-machined[152202]: New machine qemu-57-instance-0000007a.
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.690 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[668554fd-05a6-4ac7-b528-5721aa9ddf73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 systemd[1]: Started Virtual Machine qemu-57-instance-0000007a.
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.718 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bdcd29a4-00a6-45c6-806f-0cc8eced90d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.737 2 DEBUG nova.network.neutron [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updated VIF entry in instance network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.738 2 DEBUG nova.network.neutron [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:24 np0005466013 nova_compute[192144]: 2025-10-02 12:25:24.760 2 DEBUG oslo_concurrency.lockutils [req-e1406dad-f45f-4e17-9d04-c91d21901bef req-3241529c-7f48-413d-ad27-ad7b050b3242 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.760 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ce37aa50-31f0-4753-8466-193ed10d160b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.767 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[08ad2207-71c2-4b8c-a5ca-9267e9cd78e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 NetworkManager[51205]: <info>  [1759407924.7695] manager: (tap7af61307-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.822 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[16184e9f-8d9a-4205-a377-4299b317df99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.826 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[5f14bca1-8d3f-4d85-b0d3-0e9bbf6f2ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 NetworkManager[51205]: <info>  [1759407924.8604] device (tap7af61307-f0): carrier: link connected
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.869 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee45aa5-fa68-41b8-ac68-df3a95d4a2de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.892 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[07508b78-a833-4344-b6d4-7ba3e4d3c199]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7af61307-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:b0:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591647, 'reachable_time': 18202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239013, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.918 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cb34a5a4-9027-41a9-9fbc-b3eecfcd7633]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:b066'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591647, 'tstamp': 591647}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239014, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.945 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3b0ee6-6645-462d-81f4-a20830386348]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7af61307-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:b0:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591647, 'reachable_time': 18202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239015, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:24.986 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f9344af8-de4d-4c59-9d87-82086d4aeeda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.060 2 DEBUG nova.compute.manager [req-e7d42d96-58ba-4fe5-ad0c-c080effddeb4 req-fc6c0c81-063a-45c1-a7e4-76349267c70a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.061 2 DEBUG oslo_concurrency.lockutils [req-e7d42d96-58ba-4fe5-ad0c-c080effddeb4 req-fc6c0c81-063a-45c1-a7e4-76349267c70a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.061 2 DEBUG oslo_concurrency.lockutils [req-e7d42d96-58ba-4fe5-ad0c-c080effddeb4 req-fc6c0c81-063a-45c1-a7e4-76349267c70a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.062 2 DEBUG oslo_concurrency.lockutils [req-e7d42d96-58ba-4fe5-ad0c-c080effddeb4 req-fc6c0c81-063a-45c1-a7e4-76349267c70a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.062 2 DEBUG nova.compute.manager [req-e7d42d96-58ba-4fe5-ad0c-c080effddeb4 req-fc6c0c81-063a-45c1-a7e4-76349267c70a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Processing event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.081 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cea193-655c-4de8-9dbe-b46f7f4d811c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.084 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7af61307-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.085 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.085 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7af61307-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:25 np0005466013 kernel: tap7af61307-f0: entered promiscuous mode
Oct  2 08:25:25 np0005466013 NetworkManager[51205]: <info>  [1759407925.0887] manager: (tap7af61307-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.091 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7af61307-f0, col_values=(('external_ids', {'iface-id': '2ec12fcd-269a-49bb-95a9-094f7c676163'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:25Z|00494|binding|INFO|Releasing lport 2ec12fcd-269a-49bb-95a9-094f7c676163 from this chassis (sb_readonly=0)
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.095 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7af61307-f367-4334-ad00-5d542cb00bd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7af61307-f367-4334-ad00-5d542cb00bd9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.096 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e9535619-5403-4467-be7c-43a2ae68d1ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.097 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-7af61307-f367-4334-ad00-5d542cb00bd9
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/7af61307-f367-4334-ad00-5d542cb00bd9.pid.haproxy
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 7af61307-f367-4334-ad00-5d542cb00bd9
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:25:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:25.097 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'env', 'PROCESS_TAG=haproxy-7af61307-f367-4334-ad00-5d542cb00bd9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7af61307-f367-4334-ad00-5d542cb00bd9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:25 np0005466013 podman[239053]: 2025-10-02 12:25:25.526260955 +0000 UTC m=+0.071038466 container create 4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:25:25 np0005466013 systemd[1]: Started libpod-conmon-4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c.scope.
Oct  2 08:25:25 np0005466013 podman[239053]: 2025-10-02 12:25:25.494536851 +0000 UTC m=+0.039314362 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:25:25 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:25:25 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/553a81d4f0496b7c31ea0af411682bef60c197ce3609cba7ded034df9f4aa835/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:25:25 np0005466013 podman[239053]: 2025-10-02 12:25:25.643787705 +0000 UTC m=+0.188565226 container init 4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:25:25 np0005466013 podman[239053]: 2025-10-02 12:25:25.648556724 +0000 UTC m=+0.193334215 container start 4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:25:25 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [NOTICE]   (239074) : New worker (239076) forked
Oct  2 08:25:25 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [NOTICE]   (239074) : Loading success.
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.960 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407925.959613, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.961 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Started (Lifecycle Event)#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.963 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.969 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.973 2 INFO nova.virt.libvirt.driver [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Instance spawned successfully.#033[00m
Oct  2 08:25:25 np0005466013 nova_compute[192144]: 2025-10-02 12:25:25.974 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.002 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.009 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.020 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.021 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.022 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.023 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.023 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.024 2 DEBUG nova.virt.libvirt.driver [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.037 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.038 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407925.959903, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.038 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.074 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.079 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407925.9676936, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.079 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.124 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.128 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.163 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.165 2 INFO nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Took 6.17 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.166 2 DEBUG nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.260 2 INFO nova.compute.manager [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Took 6.94 seconds to build instance.#033[00m
Oct  2 08:25:26 np0005466013 nova_compute[192144]: 2025-10-02 12:25:26.282 2 DEBUG oslo_concurrency.lockutils [None req-267e8ebd-54a3-4ac9-bdf2-c6adb8688d30 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:26Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f0:49:10 10.100.0.4
Oct  2 08:25:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:26Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:49:10 10.100.0.4
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.036 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.037 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.188 2 DEBUG nova.compute.manager [req-aca7ec99-e9d0-4684-b290-e9b1e9682f3f req-cd5b6ac7-4ed7-48b9-8537-4c1752e4a437 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.189 2 DEBUG oslo_concurrency.lockutils [req-aca7ec99-e9d0-4684-b290-e9b1e9682f3f req-cd5b6ac7-4ed7-48b9-8537-4c1752e4a437 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.189 2 DEBUG oslo_concurrency.lockutils [req-aca7ec99-e9d0-4684-b290-e9b1e9682f3f req-cd5b6ac7-4ed7-48b9-8537-4c1752e4a437 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.189 2 DEBUG oslo_concurrency.lockutils [req-aca7ec99-e9d0-4684-b290-e9b1e9682f3f req-cd5b6ac7-4ed7-48b9-8537-4c1752e4a437 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.190 2 DEBUG nova.compute.manager [req-aca7ec99-e9d0-4684-b290-e9b1e9682f3f req-cd5b6ac7-4ed7-48b9-8537-4c1752e4a437 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:27 np0005466013 nova_compute[192144]: 2025-10-02 12:25:27.190 2 WARNING nova.compute.manager [req-aca7ec99-e9d0-4684-b290-e9b1e9682f3f req-cd5b6ac7-4ed7-48b9-8537-4c1752e4a437 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:25:28 np0005466013 podman[239088]: 2025-10-02 12:25:28.713508953 +0000 UTC m=+0.079374987 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:25:28 np0005466013 podman[239087]: 2025-10-02 12:25:28.736420761 +0000 UTC m=+0.100938303 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:25:28 np0005466013 podman[239089]: 2025-10-02 12:25:28.761874397 +0000 UTC m=+0.132299513 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:28 np0005466013 nova_compute[192144]: 2025-10-02 12:25:28.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:29 np0005466013 nova_compute[192144]: 2025-10-02 12:25:29.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:29 np0005466013 nova_compute[192144]: 2025-10-02 12:25:29.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.030 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.032 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.032 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.033 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.278 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.370 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.371 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.436 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.443 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.508 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.510 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.576 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.840 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.843 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5376MB free_disk=73.25577545166016GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.844 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.845 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.960 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f92877b9-dd8b-4444-a42b-987004802928 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.962 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 595aea98-0c3e-45c9-81fe-4643f44fe8d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.962 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:25:30 np0005466013 nova_compute[192144]: 2025-10-02 12:25:30.962 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.012 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.079 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.080 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.095 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.119 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.171 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.192 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.218 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.218 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:31 np0005466013 NetworkManager[51205]: <info>  [1759407931.7683] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:31 np0005466013 NetworkManager[51205]: <info>  [1759407931.7699] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:31Z|00495|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:25:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:31Z|00496|binding|INFO|Releasing lport 2ec12fcd-269a-49bb-95a9-094f7c676163 from this chassis (sb_readonly=0)
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.932 2 DEBUG nova.compute.manager [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.933 2 DEBUG nova.compute.manager [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing instance network info cache due to event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.933 2 DEBUG oslo_concurrency.lockutils [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.933 2 DEBUG oslo_concurrency.lockutils [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:31 np0005466013 nova_compute[192144]: 2025-10-02 12:25:31.934 2 DEBUG nova.network.neutron [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:33 np0005466013 nova_compute[192144]: 2025-10-02 12:25:33.772 2 DEBUG nova.network.neutron [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updated VIF entry in instance network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:25:33 np0005466013 nova_compute[192144]: 2025-10-02 12:25:33.774 2 DEBUG nova.network.neutron [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:33 np0005466013 nova_compute[192144]: 2025-10-02 12:25:33.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:33 np0005466013 nova_compute[192144]: 2025-10-02 12:25:33.801 2 DEBUG oslo_concurrency.lockutils [req-108e647c-4762-43fe-ac6a-da93c16aefbb req-4ec8f366-cdfc-42e3-b1eb-00e6e72473a7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:33 np0005466013 nova_compute[192144]: 2025-10-02 12:25:33.992 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:25:34 np0005466013 nova_compute[192144]: 2025-10-02 12:25:34.220 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:34 np0005466013 nova_compute[192144]: 2025-10-02 12:25:34.221 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:34 np0005466013 nova_compute[192144]: 2025-10-02 12:25:34.221 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:34 np0005466013 nova_compute[192144]: 2025-10-02 12:25:34.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:34 np0005466013 nova_compute[192144]: 2025-10-02 12:25:34.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:35 np0005466013 nova_compute[192144]: 2025-10-02 12:25:35.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:35 np0005466013 nova_compute[192144]: 2025-10-02 12:25:35.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:36 np0005466013 kernel: tap60b9a0ec-2a (unregistering): left promiscuous mode
Oct  2 08:25:36 np0005466013 NetworkManager[51205]: <info>  [1759407936.1685] device (tap60b9a0ec-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:36Z|00497|binding|INFO|Releasing lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e from this chassis (sb_readonly=0)
Oct  2 08:25:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:36Z|00498|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e down in Southbound
Oct  2 08:25:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:36Z|00499|binding|INFO|Removing iface tap60b9a0ec-2a ovn-installed in OVS
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.196 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:10 10.100.0.4'], port_security=['fa:16:3e:f0:49:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.198 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.203 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aa4ebb90-ef5e-4974-a53d-2aabd696731a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.206 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[29edf920-4d92-478a-b9d5-80c77e11046f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.211 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace which is not needed anymore#033[00m
Oct  2 08:25:36 np0005466013 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000077.scope: Deactivated successfully.
Oct  2 08:25:36 np0005466013 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000077.scope: Consumed 14.197s CPU time.
Oct  2 08:25:36 np0005466013 systemd-machined[152202]: Machine qemu-56-instance-00000077 terminated.
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.356 2 DEBUG nova.compute.manager [req-6b0e7c7c-39d7-4762-8f17-ea9addfc3126 req-c7936808-b83f-43ff-a44d-8b4ab28ffc37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.359 2 DEBUG oslo_concurrency.lockutils [req-6b0e7c7c-39d7-4762-8f17-ea9addfc3126 req-c7936808-b83f-43ff-a44d-8b4ab28ffc37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.359 2 DEBUG oslo_concurrency.lockutils [req-6b0e7c7c-39d7-4762-8f17-ea9addfc3126 req-c7936808-b83f-43ff-a44d-8b4ab28ffc37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.360 2 DEBUG oslo_concurrency.lockutils [req-6b0e7c7c-39d7-4762-8f17-ea9addfc3126 req-c7936808-b83f-43ff-a44d-8b4ab28ffc37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.361 2 DEBUG nova.compute.manager [req-6b0e7c7c-39d7-4762-8f17-ea9addfc3126 req-c7936808-b83f-43ff-a44d-8b4ab28ffc37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.361 2 WARNING nova.compute.manager [req-6b0e7c7c-39d7-4762-8f17-ea9addfc3126 req-c7936808-b83f-43ff-a44d-8b4ab28ffc37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:25:36 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238850]: [NOTICE]   (238854) : haproxy version is 2.8.14-c23fe91
Oct  2 08:25:36 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238850]: [NOTICE]   (238854) : path to executable is /usr/sbin/haproxy
Oct  2 08:25:36 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238850]: [WARNING]  (238854) : Exiting Master process...
Oct  2 08:25:36 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238850]: [ALERT]    (238854) : Current worker (238856) exited with code 143 (Terminated)
Oct  2 08:25:36 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238850]: [WARNING]  (238854) : All workers exited. Exiting... (0)
Oct  2 08:25:36 np0005466013 systemd[1]: libpod-29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da.scope: Deactivated successfully.
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466013 podman[239192]: 2025-10-02 12:25:36.410334029 +0000 UTC m=+0.079068767 container died 29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da-userdata-shm.mount: Deactivated successfully.
Oct  2 08:25:36 np0005466013 systemd[1]: var-lib-containers-storage-overlay-d3cd4dc8ac81464f0994ea1e10c2002180207a14eec8a18dd63f29e14a36ce85-merged.mount: Deactivated successfully.
Oct  2 08:25:36 np0005466013 podman[239192]: 2025-10-02 12:25:36.47104437 +0000 UTC m=+0.139779068 container cleanup 29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:36 np0005466013 systemd[1]: libpod-conmon-29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da.scope: Deactivated successfully.
Oct  2 08:25:36 np0005466013 podman[239237]: 2025-10-02 12:25:36.552356856 +0000 UTC m=+0.049599364 container remove 29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.563 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e9bf25b2-57d3-456c-b740-a00b1b2c9144]: (4, ('Thu Oct  2 12:25:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da)\n29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da\nThu Oct  2 12:25:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da)\n29967f643e579c6beff353e853c9de4a0893f905a95341966ab3bacbc31041da\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.567 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d5616657-a8fa-4194-8226-1a3e7882d29b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.569 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466013 kernel: tapaa4ebb90-e0: left promiscuous mode
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.598 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5e7528-c896-49d1-94a0-535f3a440c8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.625 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c6d99b-bdf2-4dd5-9ba6-3f6625e0dffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.628 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5efd461c-a424-4f78-b245-e381815cc1b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.653 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9adf2278-c2c3-4704-9880-502702ab3150]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590346, 'reachable_time': 38187, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239256, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 systemd[1]: run-netns-ovnmeta\x2daa4ebb90\x2def5e\x2d4974\x2da53d\x2d2aabd696731a.mount: Deactivated successfully.
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.660 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:25:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:36.661 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4b405c-0b54-4ac4-8cc3-b5f96572a7bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:25:36 np0005466013 nova_compute[192144]: 2025-10-02 12:25:36.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.011 2 INFO nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.020 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.020 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.025 2 INFO nova.virt.libvirt.driver [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance destroyed successfully.#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.026 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'numa_topology' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.048 2 INFO nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Attempting a stable device rescue#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.315 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.324 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.325 2 INFO nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Creating image(s)#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.327 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.328 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.330 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.330 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'trusted_certs' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.344 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "bf7d91c80d713bd4a5b1fc203f1a5c35101d6204" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:37 np0005466013 nova_compute[192144]: 2025-10-02 12:25:37.345 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "bf7d91c80d713bd4a5b1fc203f1a5c35101d6204" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.143 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updating instance_info_cache with network_info: [{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:38Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:06:5f:dd 10.100.0.6
Oct  2 08:25:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:38Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:06:5f:dd 10.100.0.6
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.254 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.254 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.461 2 DEBUG nova.compute.manager [req-e340d03a-b645-49c7-9630-be366d55682e req-cd308557-00bc-4f04-8726-d996859e1a3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.461 2 DEBUG oslo_concurrency.lockutils [req-e340d03a-b645-49c7-9630-be366d55682e req-cd308557-00bc-4f04-8726-d996859e1a3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.462 2 DEBUG oslo_concurrency.lockutils [req-e340d03a-b645-49c7-9630-be366d55682e req-cd308557-00bc-4f04-8726-d996859e1a3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.463 2 DEBUG oslo_concurrency.lockutils [req-e340d03a-b645-49c7-9630-be366d55682e req-cd308557-00bc-4f04-8726-d996859e1a3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.463 2 DEBUG nova.compute.manager [req-e340d03a-b645-49c7-9630-be366d55682e req-cd308557-00bc-4f04-8726-d996859e1a3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.464 2 WARNING nova.compute.manager [req-e340d03a-b645-49c7-9630-be366d55682e req-cd308557-00bc-4f04-8726-d996859e1a3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.554 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.622 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.part --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.624 2 DEBUG nova.virt.images [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] c8eaba5c-e487-4f7e-8b71-031fb7eb86ca was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.625 2 DEBUG nova.privsep.utils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.625 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.part /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.809 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.part /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.converted" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.821 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.896 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204.converted --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.898 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "bf7d91c80d713bd4a5b1fc203f1a5c35101d6204" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.916 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "bf7d91c80d713bd4a5b1fc203f1a5c35101d6204" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.917 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "bf7d91c80d713bd4a5b1fc203f1a5c35101d6204" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.931 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.995 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:38 np0005466013 nova_compute[192144]: 2025-10-02 12:25:38.997 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204,backing_fmt=raw /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.041 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204,backing_fmt=raw /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.rescue" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.042 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "bf7d91c80d713bd4a5b1fc203f1a5c35101d6204" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.043 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'migration_context' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.062 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.066 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Start _get_guest_xml network_info=[{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:f0:49:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'c8eaba5c-e487-4f7e-8b71-031fb7eb86ca', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.066 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'resources' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.091 2 WARNING nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.098 2 DEBUG nova.virt.libvirt.host [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.099 2 DEBUG nova.virt.libvirt.host [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.103 2 DEBUG nova.virt.libvirt.host [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.103 2 DEBUG nova.virt.libvirt.host [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.105 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.105 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.106 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.106 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.106 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.106 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.106 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.107 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.107 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.107 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.107 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.107 2 DEBUG nova.virt.hardware [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.108 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'vcpu_model' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.129 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.219 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.220 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.220 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.221 2 DEBUG oslo_concurrency.lockutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.223 2 DEBUG nova.virt.libvirt.vif [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:24:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-915765106',display_name='tempest-ServerStableDeviceRescueTest-server-915765106',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-915765106',id=119,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:25:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-6ztfdnas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:19Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=f92877b9-dd8b-4444-a42b-987004802928,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:f0:49:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.223 2 DEBUG nova.network.os_vif_util [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:f0:49:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.225 2 DEBUG nova.network.os_vif_util [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.226 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'pci_devices' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.243 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <uuid>f92877b9-dd8b-4444-a42b-987004802928</uuid>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <name>instance-00000077</name>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-915765106</nova:name>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:25:39</nova:creationTime>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:user uuid="abb9f220716e48d79dbe2f97622937c4">tempest-ServerStableDeviceRescueTest-232864240-project-member</nova:user>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:project uuid="88e90c16adec46069b539d4f1431ab4d">tempest-ServerStableDeviceRescueTest-232864240</nova:project>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        <nova:port uuid="60b9a0ec-2ade-4f90-a7b0-443ac527ec3e">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <entry name="serial">f92877b9-dd8b-4444-a42b-987004802928</entry>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <entry name="uuid">f92877b9-dd8b-4444-a42b-987004802928</entry>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.rescue"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <target dev="sdb" bus="scsi"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <boot order="1"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:f0:49:10"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <target dev="tap60b9a0ec-2a"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/console.log" append="off"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:25:39 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:25:39 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:25:39 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:25:39 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.253 2 INFO nova.virt.libvirt.driver [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance destroyed successfully.#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.325 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.326 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.327 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.327 2 DEBUG nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No VIF found with MAC fa:16:3e:f0:49:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.328 2 INFO nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Using config drive#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.348 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'ec2_ids' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:39 np0005466013 podman[239298]: 2025-10-02 12:25:39.381018865 +0000 UTC m=+0.073125471 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, 
io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.386 2 DEBUG nova.objects.instance [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'keypairs' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:39 np0005466013 podman[239297]: 2025-10-02 12:25:39.388118238 +0000 UTC m=+0.076302131 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:25:39 np0005466013 podman[239299]: 2025-10-02 12:25:39.413796552 +0000 UTC m=+0.099601691 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001)
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.889 2 INFO nova.virt.libvirt.driver [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Creating config drive at /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config.rescue#033[00m
Oct  2 08:25:39 np0005466013 nova_compute[192144]: 2025-10-02 12:25:39.894 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpajkp55zn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.040 2 DEBUG oslo_concurrency.processutils [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpajkp55zn" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:40 np0005466013 kernel: tap60b9a0ec-2a: entered promiscuous mode
Oct  2 08:25:40 np0005466013 NetworkManager[51205]: <info>  [1759407940.1212] manager: (tap60b9a0ec-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/229)
Oct  2 08:25:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:40Z|00500|binding|INFO|Claiming lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for this chassis.
Oct  2 08:25:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:40Z|00501|binding|INFO|60b9a0ec-2ade-4f90-a7b0-443ac527ec3e: Claiming fa:16:3e:f0:49:10 10.100.0.4
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:40Z|00502|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e ovn-installed in OVS
Oct  2 08:25:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:40Z|00503|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e up in Southbound
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.146 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:10 10.100.0.4'], port_security=['fa:16:3e:f0:49:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.151 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a bound to our chassis#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.155 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:25:40 np0005466013 systemd-udevd[239376]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.169 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[72161184-aedc-40cc-b918-e82c4b9861c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.170 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaa4ebb90-e1 in ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.173 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaa4ebb90-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.173 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a314607b-f5ef-41ad-92fe-7ecd6e868a68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 systemd-machined[152202]: New machine qemu-58-instance-00000077.
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.178 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f3316214-dc1e-43b8-aa92-448d6446eae1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 NetworkManager[51205]: <info>  [1759407940.1820] device (tap60b9a0ec-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:40 np0005466013 NetworkManager[51205]: <info>  [1759407940.1836] device (tap60b9a0ec-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:40 np0005466013 systemd[1]: Started Virtual Machine qemu-58-instance-00000077.
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.193 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[31738109-4784-4bd2-98f8-2b9df946136b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.213 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[45283458-acdb-422b-bfce-b4412591563d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.253 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b998aa9d-2b15-4e2d-91d2-cc237a5a06a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 NetworkManager[51205]: <info>  [1759407940.2609] manager: (tapaa4ebb90-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/230)
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.260 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0348a4bc-ead4-4bf4-b9a5-e6b4f8f20ed3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.298 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[557404fb-e487-49a5-b8ff-619f2f54213c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.301 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf7f907-304d-4d9e-9180-93d5f6bb7b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 NetworkManager[51205]: <info>  [1759407940.3287] device (tapaa4ebb90-e0): carrier: link connected
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.336 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb55f7e-2f28-4917-89ad-fe397484504b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.358 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[37b04df6-f32b-48d8-9ef5-a2833efb3d17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593194, 'reachable_time': 29315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239409, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.378 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1550cb3e-eb71-428c-8a38-062a788c8226]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:898e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593194, 'tstamp': 593194}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239410, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.403 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[48149e02-a325-482f-aec0-5d9f8cff539b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593194, 'reachable_time': 29315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239411, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.451 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bea932c2-5abe-40e6-b164-484095b3aebd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.541 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6aec7930-0e86-40b2-9b38-543ca21920af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.543 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.543 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.543 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:40 np0005466013 NetworkManager[51205]: <info>  [1759407940.5465] manager: (tapaa4ebb90-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Oct  2 08:25:40 np0005466013 kernel: tapaa4ebb90-e0: entered promiscuous mode
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.550 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:40Z|00504|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.579 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.581 2 DEBUG nova.compute.manager [req-8d6f4de5-eff2-4b07-8c88-aff08c234ab7 req-037f90d4-7ece-4aeb-b9aa-2fe38950d63c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.581 2 DEBUG oslo_concurrency.lockutils [req-8d6f4de5-eff2-4b07-8c88-aff08c234ab7 req-037f90d4-7ece-4aeb-b9aa-2fe38950d63c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.581 2 DEBUG oslo_concurrency.lockutils [req-8d6f4de5-eff2-4b07-8c88-aff08c234ab7 req-037f90d4-7ece-4aeb-b9aa-2fe38950d63c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.582 2 DEBUG oslo_concurrency.lockutils [req-8d6f4de5-eff2-4b07-8c88-aff08c234ab7 req-037f90d4-7ece-4aeb-b9aa-2fe38950d63c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.582 2 DEBUG nova.compute.manager [req-8d6f4de5-eff2-4b07-8c88-aff08c234ab7 req-037f90d4-7ece-4aeb-b9aa-2fe38950d63c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.582 2 WARNING nova.compute.manager [req-8d6f4de5-eff2-4b07-8c88-aff08c234ab7 req-037f90d4-7ece-4aeb-b9aa-2fe38950d63c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.582 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3a32ce-a5bd-4411-8366-01bc335dfbe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:40 np0005466013 nova_compute[192144]: 2025-10-02 12:25:40.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.583 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:25:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:40.586 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'env', 'PROCESS_TAG=haproxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aa4ebb90-ef5e-4974-a53d-2aabd696731a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:25:40 np0005466013 podman[239443]: 2025-10-02 12:25:40.957891425 +0000 UTC m=+0.055195859 container create e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:25:40 np0005466013 systemd[1]: Started libpod-conmon-e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e.scope.
Oct  2 08:25:41 np0005466013 podman[239443]: 2025-10-02 12:25:40.929250089 +0000 UTC m=+0.026554533 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:25:41 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:25:41 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c02963ddf8361641c9b6e3a1d2c18c65fb69c05c35c42ed38994dbffde525c5a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:25:41 np0005466013 podman[239443]: 2025-10-02 12:25:41.047800361 +0000 UTC m=+0.145104805 container init e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:25:41 np0005466013 podman[239443]: 2025-10-02 12:25:41.054086128 +0000 UTC m=+0.151390542 container start e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:25:41 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239459]: [NOTICE]   (239469) : New worker (239471) forked
Oct  2 08:25:41 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239459]: [NOTICE]   (239469) : Loading success.
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.559 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for f92877b9-dd8b-4444-a42b-987004802928 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.560 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407941.558018, f92877b9-dd8b-4444-a42b-987004802928 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.560 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.577 2 DEBUG nova.compute.manager [None req-01b14f72-5e4f-439a-9b69-697932e16f08 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.583 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.589 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.636 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.636 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407941.5599716, f92877b9-dd8b-4444-a42b-987004802928 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.637 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Started (Lifecycle Event)#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.676 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:41 np0005466013 nova_compute[192144]: 2025-10-02 12:25:41.680 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.496 2 INFO nova.compute.manager [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Unrescuing#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.497 2 DEBUG oslo_concurrency.lockutils [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.497 2 DEBUG oslo_concurrency.lockutils [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.497 2 DEBUG nova.network.neutron [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.684 2 DEBUG nova.compute.manager [req-1cb98db5-34aa-4995-98a6-7eab4e289482 req-155cae49-6d21-4d56-9234-878564c1cc57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.684 2 DEBUG oslo_concurrency.lockutils [req-1cb98db5-34aa-4995-98a6-7eab4e289482 req-155cae49-6d21-4d56-9234-878564c1cc57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.685 2 DEBUG oslo_concurrency.lockutils [req-1cb98db5-34aa-4995-98a6-7eab4e289482 req-155cae49-6d21-4d56-9234-878564c1cc57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.685 2 DEBUG oslo_concurrency.lockutils [req-1cb98db5-34aa-4995-98a6-7eab4e289482 req-155cae49-6d21-4d56-9234-878564c1cc57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.685 2 DEBUG nova.compute.manager [req-1cb98db5-34aa-4995-98a6-7eab4e289482 req-155cae49-6d21-4d56-9234-878564c1cc57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:42 np0005466013 nova_compute[192144]: 2025-10-02 12:25:42.686 2 WARNING nova.compute.manager [req-1cb98db5-34aa-4995-98a6-7eab4e289482 req-155cae49-6d21-4d56-9234-878564c1cc57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:25:43 np0005466013 nova_compute[192144]: 2025-10-02 12:25:43.595 2 INFO nova.compute.manager [None req-d5092e9b-b218-4366-8041-e660d6f4ea9d 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Get console output#033[00m
Oct  2 08:25:43 np0005466013 nova_compute[192144]: 2025-10-02 12:25:43.601 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:25:43 np0005466013 nova_compute[192144]: 2025-10-02 12:25:43.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:44 np0005466013 nova_compute[192144]: 2025-10-02 12:25:44.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:44 np0005466013 podman[239481]: 2025-10-02 12:25:44.723188325 +0000 UTC m=+0.088205343 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:25:44 np0005466013 podman[239482]: 2025-10-02 12:25:44.73769007 +0000 UTC m=+0.091030212 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:25:45 np0005466013 nova_compute[192144]: 2025-10-02 12:25:45.843 2 INFO nova.compute.manager [None req-ec392d7a-598f-4f92-96c8-f31d0fedcee7 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Get console output#033[00m
Oct  2 08:25:45 np0005466013 nova_compute[192144]: 2025-10-02 12:25:45.849 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:25:46 np0005466013 nova_compute[192144]: 2025-10-02 12:25:46.663 2 DEBUG nova.network.neutron [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updating instance_info_cache with network_info: [{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:46 np0005466013 nova_compute[192144]: 2025-10-02 12:25:46.693 2 DEBUG oslo_concurrency.lockutils [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:46 np0005466013 nova_compute[192144]: 2025-10-02 12:25:46.695 2 DEBUG nova.objects.instance [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'flavor' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:46 np0005466013 kernel: tap60b9a0ec-2a (unregistering): left promiscuous mode
Oct  2 08:25:46 np0005466013 NetworkManager[51205]: <info>  [1759407946.7720] device (tap60b9a0ec-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:25:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:46Z|00505|binding|INFO|Releasing lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e from this chassis (sb_readonly=0)
Oct  2 08:25:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:46Z|00506|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e down in Southbound
Oct  2 08:25:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:46Z|00507|binding|INFO|Removing iface tap60b9a0ec-2a ovn-installed in OVS
Oct  2 08:25:46 np0005466013 nova_compute[192144]: 2025-10-02 12:25:46.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:46 np0005466013 nova_compute[192144]: 2025-10-02 12:25:46.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:46.798 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:10 10.100.0.4'], port_security=['fa:16:3e:f0:49:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:46.800 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:25:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:46.802 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aa4ebb90-ef5e-4974-a53d-2aabd696731a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:25:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:46.804 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7a46335d-17a7-4478-9268-f4ccbd218b45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:46.804 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace which is not needed anymore#033[00m
Oct  2 08:25:46 np0005466013 nova_compute[192144]: 2025-10-02 12:25:46.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:46 np0005466013 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000077.scope: Deactivated successfully.
Oct  2 08:25:46 np0005466013 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000077.scope: Consumed 6.553s CPU time.
Oct  2 08:25:46 np0005466013 systemd-machined[152202]: Machine qemu-58-instance-00000077 terminated.
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.049 2 DEBUG nova.compute.manager [req-28165e2b-e101-43c5-aac3-9095dd558056 req-6b6cdf21-1523-4af2-8926-ac40899bf330 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.050 2 DEBUG oslo_concurrency.lockutils [req-28165e2b-e101-43c5-aac3-9095dd558056 req-6b6cdf21-1523-4af2-8926-ac40899bf330 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.050 2 DEBUG oslo_concurrency.lockutils [req-28165e2b-e101-43c5-aac3-9095dd558056 req-6b6cdf21-1523-4af2-8926-ac40899bf330 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.050 2 DEBUG oslo_concurrency.lockutils [req-28165e2b-e101-43c5-aac3-9095dd558056 req-6b6cdf21-1523-4af2-8926-ac40899bf330 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.050 2 DEBUG nova.compute.manager [req-28165e2b-e101-43c5-aac3-9095dd558056 req-6b6cdf21-1523-4af2-8926-ac40899bf330 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.050 2 WARNING nova.compute.manager [req-28165e2b-e101-43c5-aac3-9095dd558056 req-6b6cdf21-1523-4af2-8926-ac40899bf330 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.062 2 INFO nova.virt.libvirt.driver [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance destroyed successfully.#033[00m
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.062 2 DEBUG nova.objects.instance [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'numa_topology' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:47 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239459]: [NOTICE]   (239469) : haproxy version is 2.8.14-c23fe91
Oct  2 08:25:47 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239459]: [NOTICE]   (239469) : path to executable is /usr/sbin/haproxy
Oct  2 08:25:47 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239459]: [WARNING]  (239469) : Exiting Master process...
Oct  2 08:25:47 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239459]: [ALERT]    (239469) : Current worker (239471) exited with code 143 (Terminated)
Oct  2 08:25:47 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239459]: [WARNING]  (239469) : All workers exited. Exiting... (0)
Oct  2 08:25:47 np0005466013 systemd[1]: libpod-e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e.scope: Deactivated successfully.
Oct  2 08:25:47 np0005466013 podman[239544]: 2025-10-02 12:25:47.084816919 +0000 UTC m=+0.134349408 container died e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:25:47 np0005466013 kernel: tap60b9a0ec-2a: entered promiscuous mode
Oct  2 08:25:47 np0005466013 systemd-udevd[239525]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:47 np0005466013 NetworkManager[51205]: <info>  [1759407947.3362] manager: (tap60b9a0ec-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Oct  2 08:25:47 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:25:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:47Z|00508|binding|INFO|Claiming lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for this chassis.
Oct  2 08:25:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:47Z|00509|binding|INFO|60b9a0ec-2ade-4f90-a7b0-443ac527ec3e: Claiming fa:16:3e:f0:49:10 10.100.0.4
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:47 np0005466013 NetworkManager[51205]: <info>  [1759407947.3509] device (tap60b9a0ec-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:47 np0005466013 systemd[1]: var-lib-containers-storage-overlay-c02963ddf8361641c9b6e3a1d2c18c65fb69c05c35c42ed38994dbffde525c5a-merged.mount: Deactivated successfully.
Oct  2 08:25:47 np0005466013 NetworkManager[51205]: <info>  [1759407947.3522] device (tap60b9a0ec-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:47.351 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:10 10.100.0.4'], port_security=['fa:16:3e:f0:49:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:47Z|00510|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e ovn-installed in OVS
Oct  2 08:25:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:47Z|00511|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e up in Southbound
Oct  2 08:25:47 np0005466013 nova_compute[192144]: 2025-10-02 12:25:47.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:47 np0005466013 systemd-machined[152202]: New machine qemu-59-instance-00000077.
Oct  2 08:25:47 np0005466013 systemd[1]: Started Virtual Machine qemu-59-instance-00000077.
Oct  2 08:25:47 np0005466013 podman[239544]: 2025-10-02 12:25:47.509249031 +0000 UTC m=+0.558781540 container cleanup e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:25:47 np0005466013 systemd[1]: libpod-conmon-e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e.scope: Deactivated successfully.
Oct  2 08:25:48 np0005466013 podman[239616]: 2025-10-02 12:25:48.00482589 +0000 UTC m=+0.454653608 container remove e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.016 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63839ed7-1589-4bba-a199-5db845e3510a]: (4, ('Thu Oct  2 12:25:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e)\ne0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e\nThu Oct  2 12:25:47 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (e0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e)\ne0d3263f6c6eb307732f74835c5241e59d3860a2d4b1ed9a7dcc7b24ef7aa49e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.021 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d871d97e-f9d4-43b5-b8c3-a04b48ad9bef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.023 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005466013 kernel: tapaa4ebb90-e0: left promiscuous mode
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.038 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f473650-09ea-49d4-8d72-1e37a2be6fe7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.087 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0be66cc5-93a5-4764-9d46-1c2bebd5594e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.089 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[72423465-3b99-44fa-bd3f-f6cee00244f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.113 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2481f2-664f-4df5-a712-258e49a26242]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593186, 'reachable_time': 28140, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239637, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.118 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:25:48 np0005466013 systemd[1]: run-netns-ovnmeta\x2daa4ebb90\x2def5e\x2d4974\x2da53d\x2d2aabd696731a.mount: Deactivated successfully.
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.119 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[e8db7ec1-11e3-4df0-b342-a007d029bf3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.120 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.125 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.148 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b819f220-4659-4f9c-bb6f-b7713d4a8412]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.150 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaa4ebb90-e1 in ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.152 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaa4ebb90-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.153 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9c9f91-b5e9-4953-af09-3094e2e1ad26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.154 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[01b87cb7-53b4-4377-a99b-74ccf3be9302]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.175 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[864bfad6-4d78-4669-bcbb-7565889d8ce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.208 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c13444cb-d261-4ece-986d-4a25e97a5c64]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.245 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7ddf061f-f8f3-4d20-a729-7be0f323a997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.258 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[21c1f105-c5ed-475d-8e18-3589bc3c9796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 NetworkManager[51205]: <info>  [1759407948.2607] manager: (tapaa4ebb90-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.294 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8ccd99-3f0d-4463-a83d-1c8ea0cfd86b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.301 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b50959cd-786c-4d45-9c4c-a3811ca32a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 NetworkManager[51205]: <info>  [1759407948.3283] device (tapaa4ebb90-e0): carrier: link connected
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.337 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b94d034a-cb37-42fd-8f34-ff869542ad11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.355 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[71cb1cea-a057-4980-9386-75d8db491098]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239664, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.373 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef182f0-bdf9-47ee-a56f-6aa0db265640]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:898e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593994, 'tstamp': 593994}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239665, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.402 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[40548653-50b1-4c86-8917-c0fe90d155c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239666, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.450 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[949842ca-3b5d-4aa5-9abf-e08c424376d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.462 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for f92877b9-dd8b-4444-a42b-987004802928 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.463 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407948.4624639, f92877b9-dd8b-4444-a42b-987004802928 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.463 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.466 2 DEBUG nova.compute.manager [None req-0ee4bb47-5d3e-49a3-992d-b51d3835c535 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.525 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[73cd12d4-157a-43b7-87b7-5f72e0078464]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.528 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.528 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.529 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.530 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:48 np0005466013 kernel: tapaa4ebb90-e0: entered promiscuous mode
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005466013 NetworkManager[51205]: <info>  [1759407948.5360] manager: (tapaa4ebb90-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.538 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005466013 ovn_controller[94366]: 2025-10-02T12:25:48Z|00512|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.542 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.543 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7720bcd6-4a4b-46f3-8e28-e67370d644d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.544 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:25:48 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:25:48.544 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'env', 'PROCESS_TAG=haproxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aa4ebb90-ef5e-4974-a53d-2aabd696731a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.556 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.606 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407948.4647346, f92877b9-dd8b-4444-a42b-987004802928 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.606 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Started (Lifecycle Event)#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.680 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.684 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:48 np0005466013 nova_compute[192144]: 2025-10-02 12:25:48.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:49 np0005466013 podman[239698]: 2025-10-02 12:25:48.964680697 +0000 UTC m=+0.032552319 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:25:49 np0005466013 podman[239698]: 2025-10-02 12:25:49.203386263 +0000 UTC m=+0.271257815 container create 6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.333 2 DEBUG nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.333 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.335 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.335 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.336 2 DEBUG nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.336 2 WARNING nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.337 2 DEBUG nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.337 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.338 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.338 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.339 2 DEBUG nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.339 2 WARNING nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.340 2 DEBUG nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.340 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.341 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.341 2 DEBUG oslo_concurrency.lockutils [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.342 2 DEBUG nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.342 2 WARNING nova.compute.manager [req-a3296f04-df7c-40d9-80b8-d3c6dd724b1d req-ea23d44d-dc03-4f6a-a538-f7dd408b40c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:25:49 np0005466013 systemd[1]: Started libpod-conmon-6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199.scope.
Oct  2 08:25:49 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:25:49 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab613bcf399e87a977e1f0f2e769eda7bfe48b5e70667bf48a4383daa37fe142/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:25:49 np0005466013 nova_compute[192144]: 2025-10-02 12:25:49.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:50 np0005466013 podman[239698]: 2025-10-02 12:25:50.039522946 +0000 UTC m=+1.107394508 container init 6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:50 np0005466013 podman[239698]: 2025-10-02 12:25:50.050112798 +0000 UTC m=+1.117984330 container start 6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:25:50 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [NOTICE]   (239718) : New worker (239720) forked
Oct  2 08:25:50 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [NOTICE]   (239718) : Loading success.
Oct  2 08:25:50 np0005466013 nova_compute[192144]: 2025-10-02 12:25:50.109 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Check if temp file /var/lib/nova/instances/tmpdyrz5icw exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  2 08:25:50 np0005466013 nova_compute[192144]: 2025-10-02 12:25:50.115 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:50 np0005466013 nova_compute[192144]: 2025-10-02 12:25:50.197 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:50 np0005466013 nova_compute[192144]: 2025-10-02 12:25:50.199 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:50 np0005466013 nova_compute[192144]: 2025-10-02 12:25:50.279 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:50 np0005466013 nova_compute[192144]: 2025-10-02 12:25:50.282 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdyrz5icw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595aea98-0c3e-45c9-81fe-4643f44fe8d3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  2 08:25:51 np0005466013 nova_compute[192144]: 2025-10-02 12:25:51.677 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:51 np0005466013 nova_compute[192144]: 2025-10-02 12:25:51.777 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:51 np0005466013 nova_compute[192144]: 2025-10-02 12:25:51.779 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:51 np0005466013 nova_compute[192144]: 2025-10-02 12:25:51.848 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:53 np0005466013 nova_compute[192144]: 2025-10-02 12:25:53.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:54 np0005466013 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:25:54 np0005466013 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:25:54 np0005466013 nova_compute[192144]: 2025-10-02 12:25:54.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:54 np0005466013 systemd-logind[784]: New session 40 of user nova.
Oct  2 08:25:54 np0005466013 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:25:54 np0005466013 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:25:54 np0005466013 systemd[239745]: Queued start job for default target Main User Target.
Oct  2 08:25:54 np0005466013 systemd[239745]: Created slice User Application Slice.
Oct  2 08:25:54 np0005466013 systemd[239745]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:25:54 np0005466013 systemd[239745]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:25:54 np0005466013 systemd[239745]: Reached target Paths.
Oct  2 08:25:54 np0005466013 systemd[239745]: Reached target Timers.
Oct  2 08:25:54 np0005466013 systemd[239745]: Starting D-Bus User Message Bus Socket...
Oct  2 08:25:54 np0005466013 systemd[239745]: Starting Create User's Volatile Files and Directories...
Oct  2 08:25:54 np0005466013 systemd[239745]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:25:54 np0005466013 systemd[239745]: Finished Create User's Volatile Files and Directories.
Oct  2 08:25:54 np0005466013 systemd[239745]: Reached target Sockets.
Oct  2 08:25:54 np0005466013 systemd[239745]: Reached target Basic System.
Oct  2 08:25:54 np0005466013 systemd[239745]: Reached target Main User Target.
Oct  2 08:25:54 np0005466013 systemd[239745]: Startup finished in 188ms.
Oct  2 08:25:54 np0005466013 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:25:54 np0005466013 systemd[1]: Started Session 40 of User nova.
Oct  2 08:25:54 np0005466013 systemd-logind[784]: Session 40 logged out. Waiting for processes to exit.
Oct  2 08:25:54 np0005466013 systemd[1]: session-40.scope: Deactivated successfully.
Oct  2 08:25:54 np0005466013 systemd-logind[784]: Removed session 40.
Oct  2 08:25:55 np0005466013 nova_compute[192144]: 2025-10-02 12:25:55.879 2 DEBUG nova.compute.manager [req-a2e3b0d6-99d9-4efd-bc64-ef167362a7b4 req-68f48a61-bed7-4ca1-b891-1936b81cb010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:55 np0005466013 nova_compute[192144]: 2025-10-02 12:25:55.882 2 DEBUG oslo_concurrency.lockutils [req-a2e3b0d6-99d9-4efd-bc64-ef167362a7b4 req-68f48a61-bed7-4ca1-b891-1936b81cb010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:55 np0005466013 nova_compute[192144]: 2025-10-02 12:25:55.882 2 DEBUG oslo_concurrency.lockutils [req-a2e3b0d6-99d9-4efd-bc64-ef167362a7b4 req-68f48a61-bed7-4ca1-b891-1936b81cb010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:55 np0005466013 nova_compute[192144]: 2025-10-02 12:25:55.883 2 DEBUG oslo_concurrency.lockutils [req-a2e3b0d6-99d9-4efd-bc64-ef167362a7b4 req-68f48a61-bed7-4ca1-b891-1936b81cb010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:55 np0005466013 nova_compute[192144]: 2025-10-02 12:25:55.883 2 DEBUG nova.compute.manager [req-a2e3b0d6-99d9-4efd-bc64-ef167362a7b4 req-68f48a61-bed7-4ca1-b891-1936b81cb010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:55 np0005466013 nova_compute[192144]: 2025-10-02 12:25:55.883 2 DEBUG nova.compute.manager [req-a2e3b0d6-99d9-4efd-bc64-ef167362a7b4 req-68f48a61-bed7-4ca1-b891-1936b81cb010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.560 2 INFO nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Took 5.71 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.561 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.617 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdyrz5icw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595aea98-0c3e-45c9-81fe-4643f44fe8d3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1b9ceb57-30d1-4afe-80ee-df45deac9622),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.660 2 DEBUG nova.objects.instance [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'migration_context' on Instance uuid 595aea98-0c3e-45c9-81fe-4643f44fe8d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.662 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.664 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.665 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.694 2 DEBUG nova.virt.libvirt.vif [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:25:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1910014548',display_name='tempest-TestNetworkAdvancedServerOps-server-1910014548',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1910014548',id=122,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGcjNWQLhorxyiCglISTL85sF/PebgHfK15yneJUghAfWhPSxNP3NydyYmhFfkO9o84fX5BcllBeB8dR7YwYFd3thDd5cmALiWGCn51055R0ZMgFMvAQxqZx7i5T53aIfQ==',key_name='tempest-TestNetworkAdvancedServerOps-954911067',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:25:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-axe4xfwn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:26Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=595aea98-0c3e-45c9-81fe-4643f44fe8d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.695 2 DEBUG nova.network.os_vif_util [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converting VIF {"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.696 2 DEBUG nova.network.os_vif_util [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.697 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating guest XML with vif config: <interface type="ethernet">
Oct  2 08:25:57 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:06:5f:dd"/>
Oct  2 08:25:57 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:25:57 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:25:57 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:25:57 np0005466013 nova_compute[192144]:  <target dev="tap375468b9-b2"/>
Oct  2 08:25:57 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:25:57 np0005466013 nova_compute[192144]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  2 08:25:57 np0005466013 nova_compute[192144]: 2025-10-02 12:25:57.698 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.039 2 DEBUG nova.compute.manager [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.040 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.040 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.041 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.041 2 DEBUG nova.compute.manager [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.042 2 WARNING nova.compute.manager [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.042 2 DEBUG nova.compute.manager [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.043 2 DEBUG nova.compute.manager [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing instance network info cache due to event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.043 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.044 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.044 2 DEBUG nova.network.neutron [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.167 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.168 2 INFO nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.410 2 INFO nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.914 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:25:58 np0005466013 nova_compute[192144]: 2025-10-02 12:25:58.914 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:25:59 np0005466013 nova_compute[192144]: 2025-10-02 12:25:59.421 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:25:59 np0005466013 nova_compute[192144]: 2025-10-02 12:25:59.423 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:25:59 np0005466013 nova_compute[192144]: 2025-10-02 12:25:59.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:59 np0005466013 podman[239771]: 2025-10-02 12:25:59.724656426 +0000 UTC m=+0.074537975 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 
Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:25:59 np0005466013 podman[239770]: 2025-10-02 12:25:59.748666868 +0000 UTC m=+0.100511609 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:25:59 np0005466013 podman[239772]: 2025-10-02 12:25:59.782534358 +0000 UTC m=+0.134428630 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:25:59 np0005466013 nova_compute[192144]: 2025-10-02 12:25:59.844 2 DEBUG nova.network.neutron [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updated VIF entry in instance network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:25:59 np0005466013 nova_compute[192144]: 2025-10-02 12:25:59.844 2 DEBUG nova.network.neutron [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:59 np0005466013 nova_compute[192144]: 2025-10-02 12:25:59.915 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759407959.9151242, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:59 np0005466013 nova_compute[192144]: 2025-10-02 12:25:59.917 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.035 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.076 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.293 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.339 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.339 2 DEBUG nova.virt.libvirt.migration [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:26:00 np0005466013 kernel: tap375468b9-b2 (unregistering): left promiscuous mode
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.437 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  2 08:26:00 np0005466013 NetworkManager[51205]: <info>  [1759407960.4440] device (tap375468b9-b2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:26:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:26:00Z|00513|binding|INFO|Releasing lport 375468b9-b213-41ae-87ca-ea569359bdb6 from this chassis (sb_readonly=0)
Oct  2 08:26:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:26:00Z|00514|binding|INFO|Setting lport 375468b9-b213-41ae-87ca-ea569359bdb6 down in Southbound
Oct  2 08:26:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:26:00Z|00515|binding|INFO|Removing iface tap375468b9-b2 ovn-installed in OVS
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466013 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Oct  2 08:26:00 np0005466013 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007a.scope: Consumed 15.092s CPU time.
Oct  2 08:26:00 np0005466013 systemd-machined[152202]: Machine qemu-57-instance-0000007a terminated.
Oct  2 08:26:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:00.557 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:5f:dd 10.100.0.6'], port_security=['fa:16:3e:06:5f:dd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '595aea98-0c3e-45c9-81fe-4643f44fe8d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7af61307-f367-4334-ad00-5d542cb00bd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'db364350-0b47-4c18-8ab1-bf862406804b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e947ec6-847e-4b20-b912-5e8f3559dfc4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=375468b9-b213-41ae-87ca-ea569359bdb6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:00.561 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 375468b9-b213-41ae-87ca-ea569359bdb6 in datapath 7af61307-f367-4334-ad00-5d542cb00bd9 unbound from our chassis#033[00m
Oct  2 08:26:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:00.564 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7af61307-f367-4334-ad00-5d542cb00bd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:26:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:00.565 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f45d1ce0-ee72-4b9a-96cb-7822ede4a57a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:00.566 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 namespace which is not needed anymore#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.715 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.718 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.718 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.843 2 DEBUG nova.virt.libvirt.guest [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '595aea98-0c3e-45c9-81fe-4643f44fe8d3' (instance-0000007a) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.844 2 INFO nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Migration operation has completed#033[00m
Oct  2 08:26:00 np0005466013 nova_compute[192144]: 2025-10-02 12:26:00.846 2 INFO nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] _post_live_migration() is started..#033[00m
Oct  2 08:26:00 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [NOTICE]   (239074) : haproxy version is 2.8.14-c23fe91
Oct  2 08:26:00 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [NOTICE]   (239074) : path to executable is /usr/sbin/haproxy
Oct  2 08:26:00 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [WARNING]  (239074) : Exiting Master process...
Oct  2 08:26:00 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [WARNING]  (239074) : Exiting Master process...
Oct  2 08:26:00 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [ALERT]    (239074) : Current worker (239076) exited with code 143 (Terminated)
Oct  2 08:26:00 np0005466013 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[239070]: [WARNING]  (239074) : All workers exited. Exiting... (0)
Oct  2 08:26:00 np0005466013 systemd[1]: libpod-4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c.scope: Deactivated successfully.
Oct  2 08:26:00 np0005466013 podman[239875]: 2025-10-02 12:26:00.960184547 +0000 UTC m=+0.235407274 container died 4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.137 2 DEBUG nova.compute.manager [req-470db7dd-0f41-41d1-8990-bce71f910efd req-2e03b35c-8a77-4a29-8f9f-6ea12782d28a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.138 2 DEBUG oslo_concurrency.lockutils [req-470db7dd-0f41-41d1-8990-bce71f910efd req-2e03b35c-8a77-4a29-8f9f-6ea12782d28a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.139 2 DEBUG oslo_concurrency.lockutils [req-470db7dd-0f41-41d1-8990-bce71f910efd req-2e03b35c-8a77-4a29-8f9f-6ea12782d28a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.139 2 DEBUG oslo_concurrency.lockutils [req-470db7dd-0f41-41d1-8990-bce71f910efd req-2e03b35c-8a77-4a29-8f9f-6ea12782d28a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.139 2 DEBUG nova.compute.manager [req-470db7dd-0f41-41d1-8990-bce71f910efd req-2e03b35c-8a77-4a29-8f9f-6ea12782d28a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.139 2 DEBUG nova.compute.manager [req-470db7dd-0f41-41d1-8990-bce71f910efd req-2e03b35c-8a77-4a29-8f9f-6ea12782d28a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:26:01 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:26:01 np0005466013 systemd[1]: var-lib-containers-storage-overlay-553a81d4f0496b7c31ea0af411682bef60c197ce3609cba7ded034df9f4aa835-merged.mount: Deactivated successfully.
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.838 2 DEBUG nova.compute.manager [req-77a90738-5c14-4a56-baa8-f9bed26fd9bd req-bdde1350-58fd-4dc1-aa6b-f7ccf19daa6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.839 2 DEBUG oslo_concurrency.lockutils [req-77a90738-5c14-4a56-baa8-f9bed26fd9bd req-bdde1350-58fd-4dc1-aa6b-f7ccf19daa6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.840 2 DEBUG oslo_concurrency.lockutils [req-77a90738-5c14-4a56-baa8-f9bed26fd9bd req-bdde1350-58fd-4dc1-aa6b-f7ccf19daa6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.841 2 DEBUG oslo_concurrency.lockutils [req-77a90738-5c14-4a56-baa8-f9bed26fd9bd req-bdde1350-58fd-4dc1-aa6b-f7ccf19daa6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.841 2 DEBUG nova.compute.manager [req-77a90738-5c14-4a56-baa8-f9bed26fd9bd req-bdde1350-58fd-4dc1-aa6b-f7ccf19daa6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:01 np0005466013 nova_compute[192144]: 2025-10-02 12:26:01.842 2 DEBUG nova.compute.manager [req-77a90738-5c14-4a56-baa8-f9bed26fd9bd req-bdde1350-58fd-4dc1-aa6b-f7ccf19daa6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:26:01 np0005466013 podman[239875]: 2025-10-02 12:26:01.934783496 +0000 UTC m=+1.210006273 container cleanup 4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:26:01 np0005466013 systemd[1]: libpod-conmon-4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c.scope: Deactivated successfully.
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.088 2 DEBUG nova.network.neutron [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Activated binding for port 375468b9-b213-41ae-87ca-ea569359bdb6 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.089 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.091 2 DEBUG nova.virt.libvirt.vif [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:25:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1910014548',display_name='tempest-TestNetworkAdvancedServerOps-server-1910014548',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1910014548',id=122,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGcjNWQLhorxyiCglISTL85sF/PebgHfK15yneJUghAfWhPSxNP3NydyYmhFfkO9o84fX5BcllBeB8dR7YwYFd3thDd5cmALiWGCn51055R0ZMgFMvAQxqZx7i5T53aIfQ==',key_name='tempest-TestNetworkAdvancedServerOps-954911067',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:25:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-axe4xfwn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:47Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=595aea98-0c3e-45c9-81fe-4643f44fe8d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.092 2 DEBUG nova.network.os_vif_util [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converting VIF {"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.093 2 DEBUG nova.network.os_vif_util [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.094 2 DEBUG os_vif [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap375468b9-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.109 2 INFO os_vif [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2')#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.110 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.111 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.111 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.112 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.113 2 INFO nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Deleting instance files /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3_del#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.114 2 INFO nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Deletion of /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3_del complete#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.308 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.309 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.310 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:02 np0005466013 podman[239923]: 2025-10-02 12:26:02.698807941 +0000 UTC m=+0.718650256 container remove 4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.707 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a23483c6-5838-42f5-86af-6244c4421f95]: (4, ('Thu Oct  2 12:26:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 (4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c)\n4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c\nThu Oct  2 12:26:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 (4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c)\n4ed3c80a93af0b31715611e4e84428361d45a75a407dd831f5e21cbf5e51cc9c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.709 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3e0e4104-e25e-47ea-99b2-f8b9aab5a4a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.710 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7af61307-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466013 kernel: tap7af61307-f0: left promiscuous mode
Oct  2 08:26:02 np0005466013 nova_compute[192144]: 2025-10-02 12:26:02.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.730 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3fdc48e8-8810-49b6-b60f-489f6bc8e5f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.760 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d9364af8-6b28-41ca-8fe5-33d2b5c4402e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.762 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8030e2c4-02d0-489a-b873-44029c591cb5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.784 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e87f4cab-9459-4781-8e17-068168a09ede]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591636, 'reachable_time': 23393, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239939, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466013 systemd[1]: run-netns-ovnmeta\x2d7af61307\x2df367\x2d4334\x2dad00\x2d5d542cb00bd9.mount: Deactivated successfully.
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.790 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:26:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:02.790 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[ce906cda-b2ff-4f92-9e3b-3c93638a451a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.344 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.346 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.346 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.347 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.347 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.347 2 WARNING nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.348 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.348 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.348 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.348 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.349 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.349 2 WARNING nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.349 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.350 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.350 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.350 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.351 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:03 np0005466013 nova_compute[192144]: 2025-10-02 12:26:03.351 2 WARNING nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:26:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:26:03Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:49:10 10.100.0.4
Oct  2 08:26:04 np0005466013 nova_compute[192144]: 2025-10-02 12:26:04.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:05 np0005466013 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:26:05 np0005466013 systemd[239745]: Activating special unit Exit the Session...
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped target Main User Target.
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped target Basic System.
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped target Paths.
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped target Sockets.
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped target Timers.
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:26:05 np0005466013 systemd[239745]: Closed D-Bus User Message Bus Socket.
Oct  2 08:26:05 np0005466013 systemd[239745]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:26:05 np0005466013 systemd[239745]: Removed slice User Application Slice.
Oct  2 08:26:05 np0005466013 systemd[239745]: Reached target Shutdown.
Oct  2 08:26:05 np0005466013 systemd[239745]: Finished Exit the Session.
Oct  2 08:26:05 np0005466013 systemd[239745]: Reached target Exit the Session.
Oct  2 08:26:05 np0005466013 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:26:05 np0005466013 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:26:05 np0005466013 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:26:05 np0005466013 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:26:05 np0005466013 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:26:05 np0005466013 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:26:05 np0005466013 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:26:06 np0005466013 nova_compute[192144]: 2025-10-02 12:26:06.013 2 DEBUG nova.compute.manager [req-3487149b-aeca-4c48-b870-975acd05ab27 req-c9873785-e65f-425d-8e68-90ec9f8cf102 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:06 np0005466013 nova_compute[192144]: 2025-10-02 12:26:06.015 2 DEBUG oslo_concurrency.lockutils [req-3487149b-aeca-4c48-b870-975acd05ab27 req-c9873785-e65f-425d-8e68-90ec9f8cf102 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:06 np0005466013 nova_compute[192144]: 2025-10-02 12:26:06.016 2 DEBUG oslo_concurrency.lockutils [req-3487149b-aeca-4c48-b870-975acd05ab27 req-c9873785-e65f-425d-8e68-90ec9f8cf102 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:06 np0005466013 nova_compute[192144]: 2025-10-02 12:26:06.016 2 DEBUG oslo_concurrency.lockutils [req-3487149b-aeca-4c48-b870-975acd05ab27 req-c9873785-e65f-425d-8e68-90ec9f8cf102 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:06 np0005466013 nova_compute[192144]: 2025-10-02 12:26:06.016 2 DEBUG nova.compute.manager [req-3487149b-aeca-4c48-b870-975acd05ab27 req-c9873785-e65f-425d-8e68-90ec9f8cf102 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:06 np0005466013 nova_compute[192144]: 2025-10-02 12:26:06.016 2 WARNING nova.compute.manager [req-3487149b-aeca-4c48-b870-975acd05ab27 req-c9873785-e65f-425d-8e68-90ec9f8cf102 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:26:07 np0005466013 nova_compute[192144]: 2025-10-02 12:26:07.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:09 np0005466013 nova_compute[192144]: 2025-10-02 12:26:09.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:09 np0005466013 podman[239943]: 2025-10-02 12:26:09.721818716 +0000 UTC m=+0.080121650 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350)
Oct  2 08:26:09 np0005466013 podman[239944]: 2025-10-02 12:26:09.723617892 +0000 UTC m=+0.079322455 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:26:09 np0005466013 podman[239942]: 2025-10-02 12:26:09.735177034 +0000 UTC m=+0.090245156 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.513 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.515 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.515 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.643 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.644 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.645 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.646 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:26:10 np0005466013 nova_compute[192144]: 2025-10-02 12:26:10.938 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.035 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.038 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.118 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.318 2 WARNING nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.320 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5502MB free_disk=73.22223663330078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.321 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.322 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:11 np0005466013 nova_compute[192144]: 2025-10-02 12:26:11.887 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Migration for instance 595aea98-0c3e-45c9-81fe-4643f44fe8d3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:26:12 np0005466013 nova_compute[192144]: 2025-10-02 12:26:12.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:12 np0005466013 nova_compute[192144]: 2025-10-02 12:26:12.359 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  2 08:26:12 np0005466013 nova_compute[192144]: 2025-10-02 12:26:12.727 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Instance f92877b9-dd8b-4444-a42b-987004802928 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:26:12 np0005466013 nova_compute[192144]: 2025-10-02 12:26:12.728 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Migration 1b9ceb57-30d1-4afe-80ee-df45deac9622 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:26:12 np0005466013 nova_compute[192144]: 2025-10-02 12:26:12.729 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:26:12 np0005466013 nova_compute[192144]: 2025-10-02 12:26:12.729 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:26:12 np0005466013 nova_compute[192144]: 2025-10-02 12:26:12.834 2 DEBUG nova.compute.provider_tree [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:26:13 np0005466013 nova_compute[192144]: 2025-10-02 12:26:13.084 2 DEBUG nova.scheduler.client.report [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:26:13 np0005466013 nova_compute[192144]: 2025-10-02 12:26:13.741 2 DEBUG nova.compute.resource_tracker [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:26:13 np0005466013 nova_compute[192144]: 2025-10-02 12:26:13.741 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:14 np0005466013 nova_compute[192144]: 2025-10-02 12:26:14.539 2 INFO nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Migrating instance to compute-1.ctlplane.example.com finished successfully.#033[00m
Oct  2 08:26:14 np0005466013 nova_compute[192144]: 2025-10-02 12:26:14.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:15 np0005466013 podman[240006]: 2025-10-02 12:26:15.692029543 +0000 UTC m=+0.060065182 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid)
Oct  2 08:26:15 np0005466013 podman[240005]: 2025-10-02 12:26:15.692238509 +0000 UTC m=+0.061862807 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:26:15 np0005466013 nova_compute[192144]: 2025-10-02 12:26:15.713 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407960.7116346, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:15 np0005466013 nova_compute[192144]: 2025-10-02 12:26:15.713 2 INFO nova.compute.manager [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:26:17 np0005466013 nova_compute[192144]: 2025-10-02 12:26:17.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:18 np0005466013 nova_compute[192144]: 2025-10-02 12:26:18.392 2 DEBUG nova.compute.manager [None req-e84621fb-95c4-4780-b88d-da024a226b4f - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:18 np0005466013 nova_compute[192144]: 2025-10-02 12:26:18.430 2 INFO nova.scheduler.client.report [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Deleted allocation for migration 1b9ceb57-30d1-4afe-80ee-df45deac9622#033[00m
Oct  2 08:26:18 np0005466013 nova_compute[192144]: 2025-10-02 12:26:18.432 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  2 08:26:19 np0005466013 nova_compute[192144]: 2025-10-02 12:26:19.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:19.875 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:19.876 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:26:19 np0005466013 nova_compute[192144]: 2025-10-02 12:26:19.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:22 np0005466013 nova_compute[192144]: 2025-10-02 12:26:22.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:24 np0005466013 nova_compute[192144]: 2025-10-02 12:26:24.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005466013 nova_compute[192144]: 2025-10-02 12:26:27.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005466013 nova_compute[192144]: 2025-10-02 12:26:27.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:27 np0005466013 nova_compute[192144]: 2025-10-02 12:26:27.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:26:28 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:26:28.880 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:29 np0005466013 nova_compute[192144]: 2025-10-02 12:26:29.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:29 np0005466013 nova_compute[192144]: 2025-10-02 12:26:29.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:30 np0005466013 nova_compute[192144]: 2025-10-02 12:26:30.624 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:30 np0005466013 nova_compute[192144]: 2025-10-02 12:26:30.624 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:30 np0005466013 nova_compute[192144]: 2025-10-02 12:26:30.625 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:30 np0005466013 nova_compute[192144]: 2025-10-02 12:26:30.625 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:26:30 np0005466013 podman[240046]: 2025-10-02 12:26:30.725464975 +0000 UTC m=+0.071109438 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:26:30 np0005466013 podman[240045]: 2025-10-02 12:26:30.726621541 +0000 UTC m=+0.083484025 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:26:30 np0005466013 podman[240047]: 2025-10-02 12:26:30.786735643 +0000 UTC m=+0.130405354 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller)
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.055 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.141 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.143 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.205 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.404 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.405 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5512MB free_disk=73.22223663330078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.406 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.406 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.562 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f92877b9-dd8b-4444-a42b-987004802928 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.562 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.563 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.652 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.690 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.692 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:26:31 np0005466013 nova_compute[192144]: 2025-10-02 12:26:31.692 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:32 np0005466013 nova_compute[192144]: 2025-10-02 12:26:32.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:34 np0005466013 nova_compute[192144]: 2025-10-02 12:26:34.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:34 np0005466013 nova_compute[192144]: 2025-10-02 12:26:34.693 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:35 np0005466013 nova_compute[192144]: 2025-10-02 12:26:35.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:35 np0005466013 nova_compute[192144]: 2025-10-02 12:26:35.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:35 np0005466013 nova_compute[192144]: 2025-10-02 12:26:35.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:35 np0005466013 nova_compute[192144]: 2025-10-02 12:26:35.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:36 np0005466013 nova_compute[192144]: 2025-10-02 12:26:36.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:26:37Z|00516|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:26:37 np0005466013 nova_compute[192144]: 2025-10-02 12:26:37.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466013 nova_compute[192144]: 2025-10-02 12:26:37.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466013 ovn_controller[94366]: 2025-10-02T12:26:37Z|00517|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:26:37 np0005466013 nova_compute[192144]: 2025-10-02 12:26:37.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466013 nova_compute[192144]: 2025-10-02 12:26:37.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:37 np0005466013 nova_compute[192144]: 2025-10-02 12:26:37.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:26:39 np0005466013 nova_compute[192144]: 2025-10-02 12:26:39.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:40 np0005466013 nova_compute[192144]: 2025-10-02 12:26:40.548 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:26:40 np0005466013 podman[240119]: 2025-10-02 12:26:40.704549919 +0000 UTC m=+0.072338976 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd)
Oct  2 08:26:40 np0005466013 podman[240120]: 2025-10-02 12:26:40.70715807 +0000 UTC m=+0.061525348 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:26:40 np0005466013 podman[240121]: 2025-10-02 12:26:40.733826145 +0000 UTC m=+0.093480528 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:26:42 np0005466013 nova_compute[192144]: 2025-10-02 12:26:42.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:44 np0005466013 nova_compute[192144]: 2025-10-02 12:26:44.543 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:44 np0005466013 nova_compute[192144]: 2025-10-02 12:26:44.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:46 np0005466013 podman[240178]: 2025-10-02 12:26:46.715301925 +0000 UTC m=+0.082429061 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:26:46 np0005466013 podman[240177]: 2025-10-02 12:26:46.742757165 +0000 UTC m=+0.110183731 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:26:47 np0005466013 nova_compute[192144]: 2025-10-02 12:26:47.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:49 np0005466013 nova_compute[192144]: 2025-10-02 12:26:49.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:52 np0005466013 nova_compute[192144]: 2025-10-02 12:26:52.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005466013 nova_compute[192144]: 2025-10-02 12:26:54.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:57 np0005466013 nova_compute[192144]: 2025-10-02 12:26:57.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:59 np0005466013 nova_compute[192144]: 2025-10-02 12:26:59.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:01 np0005466013 podman[240222]: 2025-10-02 12:27:01.706109332 +0000 UTC m=+0.067696171 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:27:01 np0005466013 podman[240221]: 2025-10-02 12:27:01.706748931 +0000 UTC m=+0.068710062 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:27:01 np0005466013 podman[240223]: 2025-10-02 12:27:01.76000904 +0000 UTC m=+0.116367785 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:27:02 np0005466013 nova_compute[192144]: 2025-10-02 12:27:02.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:02.310 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:02.310 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:02.311 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:04 np0005466013 nova_compute[192144]: 2025-10-02 12:27:04.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:07 np0005466013 nova_compute[192144]: 2025-10-02 12:27:07.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:09 np0005466013 nova_compute[192144]: 2025-10-02 12:27:09.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:11 np0005466013 podman[240300]: 2025-10-02 12:27:11.683195166 +0000 UTC m=+0.055012103 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:27:11 np0005466013 podman[240301]: 2025-10-02 12:27:11.716225311 +0000 UTC m=+0.072785040 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:27:11 np0005466013 podman[240299]: 2025-10-02 12:27:11.724673335 +0000 UTC m=+0.088395029 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.079 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.080 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.134 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.408 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.409 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.417 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.417 2 INFO nova.compute.claims [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.681 2 DEBUG nova.compute.provider_tree [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.710 2 DEBUG nova.scheduler.client.report [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.738 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.739 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.818 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.819 2 DEBUG nova.network.neutron [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.841 2 INFO nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:27:12 np0005466013 nova_compute[192144]: 2025-10-02 12:27:12.862 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.040 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.043 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.044 2 INFO nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Creating image(s)#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.045 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.045 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.046 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.066 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.119 2 DEBUG nova.policy [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.163 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.164 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.165 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.180 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.257 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.258 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.360 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk 1073741824" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.361 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.361 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.422 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.423 2 DEBUG nova.virt.disk.api [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Checking if we can resize image /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.423 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.518 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.519 2 DEBUG nova.virt.disk.api [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Cannot resize image /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.519 2 DEBUG nova.objects.instance [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'migration_context' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.571 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.572 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Ensure instance console log exists: /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.572 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.573 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:13 np0005466013 nova_compute[192144]: 2025-10-02 12:27:13.573 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:14 np0005466013 nova_compute[192144]: 2025-10-02 12:27:14.102 2 DEBUG nova.network.neutron [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Successfully created port: 57aa5324-a5d7-424f-b3ea-20887a672a1c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:27:14 np0005466013 nova_compute[192144]: 2025-10-02 12:27:14.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.342 2 DEBUG nova.network.neutron [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Successfully updated port: 57aa5324-a5d7-424f-b3ea-20887a672a1c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.366 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.367 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.368 2 DEBUG nova.network.neutron [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.676 2 DEBUG nova.compute.manager [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-changed-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.677 2 DEBUG nova.compute.manager [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Refreshing instance network info cache due to event network-changed-57aa5324-a5d7-424f-b3ea-20887a672a1c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.677 2 DEBUG oslo_concurrency.lockutils [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:15 np0005466013 nova_compute[192144]: 2025-10-02 12:27:15.693 2 DEBUG nova.network.neutron [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.354 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f92877b9-dd8b-4444-a42b-987004802928', 'name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000077', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '88e90c16adec46069b539d4f1431ab4d', 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'hostId': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.359 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.bytes.delta volume: 1261 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e5935ec-fbf4-4e81-9dd0-dd8b1a3eb271', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1261, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.355416', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21ba0d04-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': '8874d7e36c79bc12ccdbcadde872a24a87179015132542252ed8bfd266f0b333'}]}, 'timestamp': '2025-10-02 12:27:16.359908', '_unique_id': '893683adfd2849ae8f5b07af99bfa1b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.361 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.bytes volume: 1371 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14fa8efd-9c4b-4aa1-9fb9-ba8c3e3c8de5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1371, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.362117', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21ba7046-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': 'ebcf90131c8b7b6242882c4e5225fac41d3e78946daf0e3cf06343cf2f3730a6'}]}, 'timestamp': '2025-10-02 12:27:16.362358', '_unique_id': 'fe6c28ed360447138ef02615c5da2b69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.362 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.363 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.392 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.latency volume: 733309552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.393 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42c9b595-7b26-4b17-8b4f-8321112f12e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 733309552, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.363632', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21bf2154-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': 'a8266e0b8718a8896769a6275bc91913558619a9ee220265f51771adce47009f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.363632', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21bf3afe-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': '0315c8d2e6fba570dfff8a272b02795d5bab9b881786432dcde999d2d60c2689'}]}, 'timestamp': '2025-10-02 12:27:16.393945', '_unique_id': '5b62177ba21a403aa62608106b803a40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.395 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.397 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.423 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/memory.usage volume: 42.26171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d84b262-0a66-4b71-a583-3ba147089cf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.26171875, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'timestamp': '2025-10-02T12:27:16.397751', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '21c3c4e8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.101220753, 'message_signature': '17d47c2ec1056015e6183fad7221afea3562f2d365646edfc7b338010afc9c80'}]}, 'timestamp': '2025-10-02 12:27:16.423626', '_unique_id': '335d088a39a54ad28e76a318705de42c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.426 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.426 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16b7cd1b-3f97-4ec5-8968-e9812560b88e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.426292', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21c4409e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': '68e15c560ff8bf5fd61cb6fe900db9161cd154bc3baa1263f3f7117166192e03'}]}, 'timestamp': '2025-10-02 12:27:16.426803', '_unique_id': '630c87d93ef34e89b80c7dbb4a6ed789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.429 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.429 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.429 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.429 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83838409-726e-4cd7-87df-dd004d2558ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.429680', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21c4c618-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': '6190f97f702d6f21c1993b70e7f13bc3c72b6a20ec94b9b84bb094909f595128'}]}, 'timestamp': '2025-10-02 12:27:16.430220', '_unique_id': '7138fe4fd4da4d5f9ebd019822cfdb1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.432 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.432 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.requests volume: 49 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.433 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef49c029-59db-4dad-817b-58a2c93b2b8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 49, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.432636', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21c5395e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': '6be511ac7ead9aa803bb0e836aa2b9546a0469636da6a8277a436d17375fd99e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.432636', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21c54c00-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': '74188e82f064fee925ac58842f1fff164dbaaeba42113ed5e630ace18a43c94d'}]}, 'timestamp': '2025-10-02 12:27:16.433622', '_unique_id': '6be81d6b804d44a88f15a8c2ee69f0f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.436 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.436 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.bytes volume: 1278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5da89691-1308-4f2a-ac67-db866d463971', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1278, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.436212', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21c5c39c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': '7a563bc051b6be8f4853a74102b05e04c1eef10509360f5a3f56e5b4db8e23f8'}]}, 'timestamp': '2025-10-02 12:27:16.436707', '_unique_id': '84399ed97be245b184945d6faa9a7c01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.438 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.453 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.454 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '069a4aa3-8701-4726-9d89-dd1ce0f1fe7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.439140', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21c86d18-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.117387659, 'message_signature': '128dc69c9641ef44cf0c1a65feae6ef2ba201a9d9bd2f5861b2d294652f55751'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 
'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.439140', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21c880be-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.117387659, 'message_signature': '416548fc98920149469b8465bc33bbe2ca556505b5200091def0f323f4df6620'}]}, 'timestamp': '2025-10-02 12:27:16.454630', '_unique_id': '1260fd1c07854dea8c01118cb7c87d61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.457 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.latency volume: 2889073819 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.457 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.latency volume: 144553647 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2473873b-3f02-4039-9a8f-75ae3afbdf6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2889073819, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.457292', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21c8fbc0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': 'd88c9bdb8d8f71f88a6e0b96243f1f35e390c0c63b21308c48345b8c2d786647'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 144553647, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 
'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.457292', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21c90eb2-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': 'fc30f3e6969e1dd25bd58101551614c136d9f062988c00abac0a1a48a0f65a68'}]}, 'timestamp': '2025-10-02 12:27:16.458261', '_unique_id': '13b30b91d3fc45e98e7b3f398b4db181'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.461 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.461 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.462 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ae2fdd1-0fa5-4bc9-bab8-8a4bcb78eb9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.461774', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21c9abd8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': 'e615ef63e81375591f21728af4df754177fe6cb6b6a597b1e9c2c16a91cbad19'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 
'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.461774', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21c9b81c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': '849cfe5fd5333e48579ffb6a8faec3e751b4445a6555c8f1f33c9b33079aedf2'}]}, 'timestamp': '2025-10-02 12:27:16.462504', '_unique_id': 'c4b8e35a7aba47f5a962c358cf6d27ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.464 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/cpu volume: 13550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a70da36-3ead-466b-a9ef-20fb2aec03aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13550000000, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'timestamp': '2025-10-02T12:27:16.464786', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '21ca1bcc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.101220753, 'message_signature': '13a08590865d5117ab5164a7402c447130d9e0466083d9b7081628c460972ec4'}]}, 'timestamp': '2025-10-02 12:27:16.465049', '_unique_id': 'd4a235fb04944495bf09fa5ee4b6fa87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.465 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.466 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.466 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.466 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.466 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f96f8aa4-76ac-4395-a080-b4b74bdb51cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.466307', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21ca55c4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.117387659, 'message_signature': 'a161bd79405ffe328e34acbd5b6d6ff88d222996382c9d303abe3f772e4fba03'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.466307', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21ca5e84-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.117387659, 'message_signature': '9b5361f16c66a86fa63ce4e69bafa520f1c9df25eab80b38b13b890308871f92'}]}, 'timestamp': '2025-10-02 12:27:16.466743', '_unique_id': '7393fc16a0ea46eea4bc9eaf0720487f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.467 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.bytes.delta volume: 1278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba83f0d1-ccb9-45cb-a38f-a2f36d9351a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1278, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.467921', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21ca9598-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': '48b8954f979d24033ff41acfa3602d35f006a97d59f099f7bf2d63a58d9ab10b'}]}, 'timestamp': '2025-10-02 12:27:16.468179', '_unique_id': 'bdb5e5d189644755b082ec98a6ce2ee0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.469 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.469 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c2b3652-5998-4bdd-84a0-5c4868a55a4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.469489', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21cad350-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': '87e65e3f08d4beac2135cec06ec2bf39d7eee67e4aa6fbc49b3c34ff91c51fc7'}]}, 'timestamp': '2025-10-02 12:27:16.469780', '_unique_id': '58ba11ab3c1e4ea9a11558231dff32d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.470 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c046e06-b901-4f82-ae24-2426bb7d01b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.470918', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21cb0afa-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': '589a36884bdfea984214fe910ee7638c94abcf359b2cc31dacf84d7ba03e6570'}]}, 'timestamp': '2025-10-02 12:27:16.471172', '_unique_id': '0af97cbcfcfc40e891a097d5ae360902'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.472 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.472 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81402891-3335-4468-9d12-4908450cde58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.472303', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21cb4010-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': 'fac8e6a92239f6726a2814bcfe49685d15ce39f785027323f485978f38b23ffe'}]}, 'timestamp': '2025-10-02 12:27:16.472546', '_unique_id': '16b2748b70464cd4a92216e68ca02f2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.requests volume: 1210 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.473 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc05e7d5-30a0-4ba3-b557-0f178b0630c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1210, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.473660', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21cb75b2-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': '16e9acc990ab775d227c02260371649119c27cae342779eacc73a65aa927677c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 
'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.473660', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21cb7f26-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': 'b6754f9e3cc5614399d93e4bfd26d851d9c92072212e58e6d7746facf7a43fb1'}]}, 'timestamp': '2025-10-02 12:27:16.474133', '_unique_id': '153c05cda2ee4505ab927866e156783e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.475 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.475 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '461985cd-6284-4ee9-8d9f-1dac7280b1a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.475257', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21cbb432-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.117387659, 'message_signature': 'f8094ed17129b39e79657a7d5cac79d949a457dd15a077d3120d3095ee0b876b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 
'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.475257', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21cbbc34-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.117387659, 'message_signature': '84ad5fa2f045c8cbe65793c1ebc92d88082637eab3fd8738a22b2fd887b358e1'}]}, 'timestamp': '2025-10-02 12:27:16.475695', '_unique_id': 'f842476f1af34590a79012d5fc90484d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.476 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.bytes volume: 32040960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcb79e43-6b13-4e3e-8d16-bbb6aaf420fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32040960, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-vda', 'timestamp': '2025-10-02T12:27:16.476804', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21cbf0be-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': '4ebd8cd672da2c2e5461d01c1dbc54fe81d36f0eb66d5178d78e8b5ee8afe614'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': 
None, 'resource_id': 'f92877b9-dd8b-4444-a42b-987004802928-sda', 'timestamp': '2025-10-02T12:27:16.476804', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'instance-00000077', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21cbf8b6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.041834253, 'message_signature': '5e43ab4a88ca5ad19808be957a9479ea8c2396ce73b3e8323d95bd59b7fd288f'}]}, 'timestamp': '2025-10-02 12:27:16.477274', '_unique_id': '34721cb5566c43d9bf78aaeb99f0015c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.477 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.478 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.478 12 DEBUG ceilometer.compute.pollsters [-] f92877b9-dd8b-4444-a42b-987004802928/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4a3fdf1-7bcf-433b-8adc-39f06b96153a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-00000077-f92877b9-dd8b-4444-a42b-987004802928-tap60b9a0ec-2a', 'timestamp': '2025-10-02T12:27:16.478638', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-915765106', 'name': 'tap60b9a0ec-2a', 'instance_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'instance_type': 'm1.nano', 'host': '8a20db74cc27c6e912972daee3706ee93d4b911dab15aa15e57bb34f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:49:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60b9a0ec-2a'}, 'message_id': '21cc3948-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6028.033625136, 'message_signature': 'e7af5a4921a0158c1a8b84c2db835dbb16548468ccdfe2c6b97427ae0b0b172b'}]}, 'timestamp': '2025-10-02 12:27:16.479006', '_unique_id': '076ca0dcc64f4f0589b691cbad36a35c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:27:16.479 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466013 nova_compute[192144]: 2025-10-02 12:27:17.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:17 np0005466013 podman[240374]: 2025-10-02 12:27:17.677395914 +0000 UTC m=+0.050115741 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:27:17 np0005466013 podman[240375]: 2025-10-02 12:27:17.706263958 +0000 UTC m=+0.064883274 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.418 2 DEBUG nova.network.neutron [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Updating instance_info_cache with network_info: [{"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.557 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.557 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance network_info: |[{"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.558 2 DEBUG oslo_concurrency.lockutils [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.558 2 DEBUG nova.network.neutron [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Refreshing network info cache for port 57aa5324-a5d7-424f-b3ea-20887a672a1c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.561 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Start _get_guest_xml network_info=[{"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.565 2 WARNING nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.576 2 DEBUG nova.virt.libvirt.host [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.576 2 DEBUG nova.virt.libvirt.host [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.584 2 DEBUG nova.virt.libvirt.host [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.584 2 DEBUG nova.virt.libvirt.host [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.586 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.586 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.586 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.587 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.587 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.587 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.587 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.587 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.588 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.588 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.588 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.588 2 DEBUG nova.virt.hardware [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.592 2 DEBUG nova.virt.libvirt.vif [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-784230985',display_name='tempest-ServerStableDeviceRescueTest-server-784230985',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-784230985',id=126,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-9mkkq6lq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:12Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=351ed18e-9759-4759-9a20-24f8b3d59908,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.592 2 DEBUG nova.network.os_vif_util [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.593 2 DEBUG nova.network.os_vif_util [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.594 2 DEBUG nova.objects.instance [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'pci_devices' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.632 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <uuid>351ed18e-9759-4759-9a20-24f8b3d59908</uuid>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <name>instance-0000007e</name>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-784230985</nova:name>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:27:19</nova:creationTime>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:user uuid="abb9f220716e48d79dbe2f97622937c4">tempest-ServerStableDeviceRescueTest-232864240-project-member</nova:user>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:project uuid="88e90c16adec46069b539d4f1431ab4d">tempest-ServerStableDeviceRescueTest-232864240</nova:project>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        <nova:port uuid="57aa5324-a5d7-424f-b3ea-20887a672a1c">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <entry name="serial">351ed18e-9759-4759-9a20-24f8b3d59908</entry>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <entry name="uuid">351ed18e-9759-4759-9a20-24f8b3d59908</entry>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:b3:ab:89"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <target dev="tap57aa5324-a5"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/console.log" append="off"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:27:19 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:27:19 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:27:19 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:27:19 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.633 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Preparing to wait for external event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.634 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.635 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.635 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.636 2 DEBUG nova.virt.libvirt.vif [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-784230985',display_name='tempest-ServerStableDeviceRescueTest-server-784230985',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-784230985',id=126,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-9mkkq6lq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:12Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=351ed18e-9759-4759-9a20-24f8b3d59908,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.636 2 DEBUG nova.network.os_vif_util [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.637 2 DEBUG nova.network.os_vif_util [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.638 2 DEBUG os_vif [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57aa5324-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.647 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57aa5324-a5, col_values=(('external_ids', {'iface-id': '57aa5324-a5d7-424f-b3ea-20887a672a1c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:ab:89', 'vm-uuid': '351ed18e-9759-4759-9a20-24f8b3d59908'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:19 np0005466013 NetworkManager[51205]: <info>  [1759408039.6502] manager: (tap57aa5324-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.659 2 INFO os_vif [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5')#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.831 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.832 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.832 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No VIF found with MAC fa:16:3e:b3:ab:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:27:19 np0005466013 nova_compute[192144]: 2025-10-02 12:27:19.833 2 INFO nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Using config drive#033[00m
Oct  2 08:27:20 np0005466013 nova_compute[192144]: 2025-10-02 12:27:20.773 2 INFO nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Creating config drive at /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config#033[00m
Oct  2 08:27:20 np0005466013 nova_compute[192144]: 2025-10-02 12:27:20.783 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa0a2yg2c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:20 np0005466013 nova_compute[192144]: 2025-10-02 12:27:20.925 2 DEBUG oslo_concurrency.processutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa0a2yg2c" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:20.991 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:20.993 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:27:20 np0005466013 nova_compute[192144]: 2025-10-02 12:27:20.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:21 np0005466013 kernel: tap57aa5324-a5: entered promiscuous mode
Oct  2 08:27:21 np0005466013 NetworkManager[51205]: <info>  [1759408041.0204] manager: (tap57aa5324-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Oct  2 08:27:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:21Z|00518|binding|INFO|Claiming lport 57aa5324-a5d7-424f-b3ea-20887a672a1c for this chassis.
Oct  2 08:27:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:21Z|00519|binding|INFO|57aa5324-a5d7-424f-b3ea-20887a672a1c: Claiming fa:16:3e:b3:ab:89 10.100.0.8
Oct  2 08:27:21 np0005466013 nova_compute[192144]: 2025-10-02 12:27:21.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:21Z|00520|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c ovn-installed in OVS
Oct  2 08:27:21 np0005466013 nova_compute[192144]: 2025-10-02 12:27:21.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:21 np0005466013 nova_compute[192144]: 2025-10-02 12:27:21.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.064 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:ab:89 10.100.0.8'], port_security=['fa:16:3e:b3:ab:89 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '351ed18e-9759-4759-9a20-24f8b3d59908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=57aa5324-a5d7-424f-b3ea-20887a672a1c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:21Z|00521|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c up in Southbound
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.065 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 57aa5324-a5d7-424f-b3ea-20887a672a1c in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a bound to our chassis#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.067 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:27:21 np0005466013 systemd-udevd[240439]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:27:21 np0005466013 systemd-machined[152202]: New machine qemu-60-instance-0000007e.
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.087 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[28a27ee8-20ec-465c-8da3-0a8cca015d1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:21 np0005466013 NetworkManager[51205]: <info>  [1759408041.0967] device (tap57aa5324-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:27:21 np0005466013 systemd[1]: Started Virtual Machine qemu-60-instance-0000007e.
Oct  2 08:27:21 np0005466013 NetworkManager[51205]: <info>  [1759408041.0980] device (tap57aa5324-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.135 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fee0ce2f-f4e8-48ef-86ae-a31f047d38a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.138 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7e4c66-b34d-4eaf-b5f8-f0ced508753d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.165 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0a105761-f1da-4777-ae3b-13a6cb69dea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.180 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5a640f3c-e96a-4ade-9a02-c39bc2285904]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240452, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.200 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[de5f731e-bc3a-44d8-a891-df5c79b76575]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594010, 'tstamp': 594010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240454, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594013, 'tstamp': 594013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240454, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.202 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:21 np0005466013 nova_compute[192144]: 2025-10-02 12:27:21.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.205 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.205 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.205 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:21.206 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.072 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408042.071605, 351ed18e-9759-4759-9a20-24f8b3d59908 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.073 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Started (Lifecycle Event)#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.101 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.107 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408042.0718114, 351ed18e-9759-4759-9a20-24f8b3d59908 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.108 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.226 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.229 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.296 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.460 2 DEBUG nova.compute.manager [req-c2508902-f24a-40b2-8f8a-b1753e59f689 req-da827fd5-542d-44c8-b445-4f7d533c4ad3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.460 2 DEBUG oslo_concurrency.lockutils [req-c2508902-f24a-40b2-8f8a-b1753e59f689 req-da827fd5-542d-44c8-b445-4f7d533c4ad3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.460 2 DEBUG oslo_concurrency.lockutils [req-c2508902-f24a-40b2-8f8a-b1753e59f689 req-da827fd5-542d-44c8-b445-4f7d533c4ad3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.461 2 DEBUG oslo_concurrency.lockutils [req-c2508902-f24a-40b2-8f8a-b1753e59f689 req-da827fd5-542d-44c8-b445-4f7d533c4ad3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.461 2 DEBUG nova.compute.manager [req-c2508902-f24a-40b2-8f8a-b1753e59f689 req-da827fd5-542d-44c8-b445-4f7d533c4ad3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Processing event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.462 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.465 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408042.4656725, 351ed18e-9759-4759-9a20-24f8b3d59908 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.466 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.467 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.471 2 INFO nova.virt.libvirt.driver [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance spawned successfully.#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.472 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.555 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.560 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.581 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.582 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.583 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.583 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.584 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.585 2 DEBUG nova.virt.libvirt.driver [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.688 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.794 2 INFO nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Took 9.75 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.795 2 DEBUG nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:22 np0005466013 nova_compute[192144]: 2025-10-02 12:27:22.956 2 INFO nova.compute.manager [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Took 10.63 seconds to build instance.#033[00m
Oct  2 08:27:23 np0005466013 nova_compute[192144]: 2025-10-02 12:27:23.031 2 DEBUG oslo_concurrency.lockutils [None req-935e83d6-7673-4c16-ba28-361f0aa5a43e abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:23 np0005466013 nova_compute[192144]: 2025-10-02 12:27:23.995 2 DEBUG nova.network.neutron [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Updated VIF entry in instance network info cache for port 57aa5324-a5d7-424f-b3ea-20887a672a1c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:27:23 np0005466013 nova_compute[192144]: 2025-10-02 12:27:23.996 2 DEBUG nova.network.neutron [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Updating instance_info_cache with network_info: [{"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.026 2 DEBUG oslo_concurrency.lockutils [req-0c4507df-8ced-4f94-b74f-08b3c378e7fd req-bf238479-bd44-46b7-aa9d-d465d3b9e45a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.662 2 DEBUG nova.compute.manager [req-593e0178-ef10-44ac-a35d-bf52c2c708a1 req-1cec7d8f-a9a8-4225-9865-fc4db55669de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.662 2 DEBUG oslo_concurrency.lockutils [req-593e0178-ef10-44ac-a35d-bf52c2c708a1 req-1cec7d8f-a9a8-4225-9865-fc4db55669de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.663 2 DEBUG oslo_concurrency.lockutils [req-593e0178-ef10-44ac-a35d-bf52c2c708a1 req-1cec7d8f-a9a8-4225-9865-fc4db55669de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.663 2 DEBUG oslo_concurrency.lockutils [req-593e0178-ef10-44ac-a35d-bf52c2c708a1 req-1cec7d8f-a9a8-4225-9865-fc4db55669de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.663 2 DEBUG nova.compute.manager [req-593e0178-ef10-44ac-a35d-bf52c2c708a1 req-1cec7d8f-a9a8-4225-9865-fc4db55669de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:24 np0005466013 nova_compute[192144]: 2025-10-02 12:27:24.664 2 WARNING nova.compute.manager [req-593e0178-ef10-44ac-a35d-bf52c2c708a1 req-1cec7d8f-a9a8-4225-9865-fc4db55669de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:27:26 np0005466013 nova_compute[192144]: 2025-10-02 12:27:26.237 2 DEBUG nova.compute.manager [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:26 np0005466013 nova_compute[192144]: 2025-10-02 12:27:26.375 2 INFO nova.compute.manager [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] instance snapshotting#033[00m
Oct  2 08:27:26 np0005466013 nova_compute[192144]: 2025-10-02 12:27:26.879 2 INFO nova.virt.libvirt.driver [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Beginning live snapshot process#033[00m
Oct  2 08:27:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:26.996 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:27 np0005466013 virtqemud[191867]: invalid argument: disk vda does not have an active block job
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.177 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.236 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.239 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.327 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json -f qcow2" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.340 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.395 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.397 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkgd06ei4/81da63b41efa4c98aea43e619b1b125c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.443 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkgd06ei4/81da63b41efa4c98aea43e619b1b125c.delta 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.444 2 INFO nova.virt.libvirt.driver [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.509 2 DEBUG nova.virt.libvirt.guest [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.513 2 INFO nova.virt.libvirt.driver [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.570 2 DEBUG nova.privsep.utils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.571 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkgd06ei4/81da63b41efa4c98aea43e619b1b125c.delta /var/lib/nova/instances/snapshots/tmpkgd06ei4/81da63b41efa4c98aea43e619b1b125c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.845 2 DEBUG oslo_concurrency.processutils [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkgd06ei4/81da63b41efa4c98aea43e619b1b125c.delta /var/lib/nova/instances/snapshots/tmpkgd06ei4/81da63b41efa4c98aea43e619b1b125c" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.848 2 INFO nova.virt.libvirt.driver [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:27 np0005466013 nova_compute[192144]: 2025-10-02 12:27:27.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:27:29 np0005466013 nova_compute[192144]: 2025-10-02 12:27:29.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:29 np0005466013 nova_compute[192144]: 2025-10-02 12:27:29.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:29 np0005466013 nova_compute[192144]: 2025-10-02 12:27:29.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.042 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.043 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.043 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.044 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.141 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.225 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.227 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.289 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.297 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.378 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.380 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.449 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.614 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.615 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5396MB free_disk=73.19841384887695GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.616 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.616 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.732 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f92877b9-dd8b-4444-a42b-987004802928 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.733 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 351ed18e-9759-4759-9a20-24f8b3d59908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.734 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.734 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.843 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.871 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.907 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:27:30 np0005466013 nova_compute[192144]: 2025-10-02 12:27:30.908 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:31 np0005466013 nova_compute[192144]: 2025-10-02 12:27:31.120 2 INFO nova.virt.libvirt.driver [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Snapshot image upload complete#033[00m
Oct  2 08:27:31 np0005466013 nova_compute[192144]: 2025-10-02 12:27:31.121 2 INFO nova.compute.manager [None req-e05a1e4c-0ede-44b3-848e-1abcc791f36c abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Took 4.73 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:27:32 np0005466013 podman[240502]: 2025-10-02 12:27:32.696069752 +0000 UTC m=+0.067019150 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:27:32 np0005466013 podman[240501]: 2025-10-02 12:27:32.697112894 +0000 UTC m=+0.069171987 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:27:32 np0005466013 podman[240503]: 2025-10-02 12:27:32.730087216 +0000 UTC m=+0.101064325 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true)
Oct  2 08:27:34 np0005466013 nova_compute[192144]: 2025-10-02 12:27:34.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:34 np0005466013 nova_compute[192144]: 2025-10-02 12:27:34.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:34 np0005466013 nova_compute[192144]: 2025-10-02 12:27:34.873 2 INFO nova.compute.manager [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Rescuing#033[00m
Oct  2 08:27:34 np0005466013 nova_compute[192144]: 2025-10-02 12:27:34.874 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:34 np0005466013 nova_compute[192144]: 2025-10-02 12:27:34.874 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:34 np0005466013 nova_compute[192144]: 2025-10-02 12:27:34.874 2 DEBUG nova.network.neutron [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:27:35 np0005466013 nova_compute[192144]: 2025-10-02 12:27:35.909 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:35 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:35Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b3:ab:89 10.100.0.8
Oct  2 08:27:35 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:35Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:ab:89 10.100.0.8
Oct  2 08:27:35 np0005466013 nova_compute[192144]: 2025-10-02 12:27:35.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:36 np0005466013 nova_compute[192144]: 2025-10-02 12:27:36.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:36 np0005466013 nova_compute[192144]: 2025-10-02 12:27:36.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:37 np0005466013 nova_compute[192144]: 2025-10-02 12:27:37.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:37 np0005466013 nova_compute[192144]: 2025-10-02 12:27:37.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:27:37 np0005466013 nova_compute[192144]: 2025-10-02 12:27:37.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:27:38 np0005466013 nova_compute[192144]: 2025-10-02 12:27:38.456 2 DEBUG nova.network.neutron [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Updating instance_info_cache with network_info: [{"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:39 np0005466013 nova_compute[192144]: 2025-10-02 12:27:39.140 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:39 np0005466013 nova_compute[192144]: 2025-10-02 12:27:39.140 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:39 np0005466013 nova_compute[192144]: 2025-10-02 12:27:39.141 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:27:39 np0005466013 nova_compute[192144]: 2025-10-02 12:27:39.141 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:39 np0005466013 nova_compute[192144]: 2025-10-02 12:27:39.435 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:39 np0005466013 nova_compute[192144]: 2025-10-02 12:27:39.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:27:40 np0005466013 nova_compute[192144]: 2025-10-02 12:27:40.650 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:27:40 np0005466013 nova_compute[192144]: 2025-10-02 12:27:40.756 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:40 np0005466013 nova_compute[192144]: 2025-10-02 12:27:40.756 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:40 np0005466013 nova_compute[192144]: 2025-10-02 12:27:40.955 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:27:42 np0005466013 nova_compute[192144]: 2025-10-02 12:27:42.628 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:42 np0005466013 nova_compute[192144]: 2025-10-02 12:27:42.629 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:42 np0005466013 nova_compute[192144]: 2025-10-02 12:27:42.641 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:27:42 np0005466013 nova_compute[192144]: 2025-10-02 12:27:42.642 2 INFO nova.compute.claims [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:27:42 np0005466013 podman[240586]: 2025-10-02 12:27:42.719500883 +0000 UTC m=+0.099160886 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:27:42 np0005466013 podman[240592]: 2025-10-02 12:27:42.720311599 +0000 UTC m=+0.079614849 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:27:42 np0005466013 podman[240587]: 2025-10-02 12:27:42.72386998 +0000 UTC m=+0.080123076 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.027 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updating instance_info_cache with network_info: [{"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:43 np0005466013 kernel: tap57aa5324-a5 (unregistering): left promiscuous mode
Oct  2 08:27:43 np0005466013 NetworkManager[51205]: <info>  [1759408063.0480] device (tap57aa5324-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:27:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:43Z|00522|binding|INFO|Releasing lport 57aa5324-a5d7-424f-b3ea-20887a672a1c from this chassis (sb_readonly=0)
Oct  2 08:27:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:43Z|00523|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c down in Southbound
Oct  2 08:27:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:43Z|00524|binding|INFO|Removing iface tap57aa5324-a5 ovn-installed in OVS
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:43 np0005466013 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Oct  2 08:27:43 np0005466013 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007e.scope: Consumed 13.346s CPU time.
Oct  2 08:27:43 np0005466013 systemd-machined[152202]: Machine qemu-60-instance-0000007e terminated.
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.379 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:ab:89 10.100.0.8'], port_security=['fa:16:3e:b3:ab:89 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '351ed18e-9759-4759-9a20-24f8b3d59908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=57aa5324-a5d7-424f-b3ea-20887a672a1c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.381 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 57aa5324-a5d7-424f-b3ea-20887a672a1c in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.382 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.404 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f0be6a07-0189-457a-b1d5-7da162b82962]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.442 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d33ba63e-8ada-4912-9fd0-336b6d188562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.447 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[14169701-40c9-422d-b4e6-11cb1f7b921b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.460 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-f92877b9-dd8b-4444-a42b-987004802928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.460 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.461 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.461 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.473 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c9782328-75ee-4b6a-aad0-12820c753411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.488 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7e903b-ecc3-494f-8abb-5a1fd5a291ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240669, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.508 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a665220b-6fac-4d33-9301-2c52ae13264b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594010, 'tstamp': 594010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240670, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594013, 'tstamp': 594013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240670, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.510 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.516 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.516 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.516 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:43.516 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.671 2 INFO nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.678 2 INFO nova.virt.libvirt.driver [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance destroyed successfully.#033[00m
Oct  2 08:27:43 np0005466013 nova_compute[192144]: 2025-10-02 12:27:43.679 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'numa_topology' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.420 2 INFO nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Attempting a stable device rescue#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.663 2 DEBUG nova.compute.manager [req-88b04402-a9b7-4d0e-b238-dea9b3970b84 req-f281ba3d-9298-4646-9064-1f16b58ba34b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.663 2 DEBUG oslo_concurrency.lockutils [req-88b04402-a9b7-4d0e-b238-dea9b3970b84 req-f281ba3d-9298-4646-9064-1f16b58ba34b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.664 2 DEBUG oslo_concurrency.lockutils [req-88b04402-a9b7-4d0e-b238-dea9b3970b84 req-f281ba3d-9298-4646-9064-1f16b58ba34b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.664 2 DEBUG oslo_concurrency.lockutils [req-88b04402-a9b7-4d0e-b238-dea9b3970b84 req-f281ba3d-9298-4646-9064-1f16b58ba34b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.665 2 DEBUG nova.compute.manager [req-88b04402-a9b7-4d0e-b238-dea9b3970b84 req-f281ba3d-9298-4646-9064-1f16b58ba34b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:44 np0005466013 nova_compute[192144]: 2025-10-02 12:27:44.666 2 WARNING nova.compute.manager [req-88b04402-a9b7-4d0e-b238-dea9b3970b84 req-f281ba3d-9298-4646-9064-1f16b58ba34b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.256 2 DEBUG nova.compute.provider_tree [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.344 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.351 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.352 2 INFO nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Creating image(s)#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.353 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.353 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.354 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.354 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.781 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "2563015c4ccc448cfc2f148d9d2544bae12af308" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.782 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "2563015c4ccc448cfc2f148d9d2544bae12af308" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:45 np0005466013 nova_compute[192144]: 2025-10-02 12:27:45.786 2 DEBUG nova.scheduler.client.report [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.206 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.206 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.858 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.859 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.926 2 DEBUG nova.compute.manager [req-f5620b5a-c541-4deb-8047-1c837a400f26 req-7229537c-357f-4fd2-8fac-612b1e2459ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.927 2 DEBUG oslo_concurrency.lockutils [req-f5620b5a-c541-4deb-8047-1c837a400f26 req-7229537c-357f-4fd2-8fac-612b1e2459ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.927 2 DEBUG oslo_concurrency.lockutils [req-f5620b5a-c541-4deb-8047-1c837a400f26 req-7229537c-357f-4fd2-8fac-612b1e2459ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.928 2 DEBUG oslo_concurrency.lockutils [req-f5620b5a-c541-4deb-8047-1c837a400f26 req-7229537c-357f-4fd2-8fac-612b1e2459ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.928 2 DEBUG nova.compute.manager [req-f5620b5a-c541-4deb-8047-1c837a400f26 req-7229537c-357f-4fd2-8fac-612b1e2459ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:46 np0005466013 nova_compute[192144]: 2025-10-02 12:27:46.929 2 WARNING nova.compute.manager [req-f5620b5a-c541-4deb-8047-1c837a400f26 req-7229537c-357f-4fd2-8fac-612b1e2459ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:27:47 np0005466013 nova_compute[192144]: 2025-10-02 12:27:47.064 2 INFO nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:27:47 np0005466013 nova_compute[192144]: 2025-10-02 12:27:47.216 2 DEBUG nova.policy [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:27:47 np0005466013 nova_compute[192144]: 2025-10-02 12:27:47.685 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:27:48 np0005466013 podman[240672]: 2025-10-02 12:27:48.719060083 +0000 UTC m=+0.089984832 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:27:48 np0005466013 podman[240673]: 2025-10-02 12:27:48.719263809 +0000 UTC m=+0.091554691 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid)
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.042 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.106 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.108 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.109 2 INFO nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Creating image(s)#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.109 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.110 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.110 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.124 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.part --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.124 2 DEBUG nova.virt.images [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] 029c31da-a19a-49fa-b3ae-74dda129673a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.130 2 DEBUG nova.privsep.utils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.131 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.part /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.162 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.243 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.244 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.245 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.255 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.302 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.part /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.converted" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.310 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.337 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.338 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.379 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.380 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.381 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.406 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308.converted --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.408 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "2563015c4ccc448cfc2f148d9d2544bae12af308" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.433 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "2563015c4ccc448cfc2f148d9d2544bae12af308" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.434 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "2563015c4ccc448cfc2f148d9d2544bae12af308" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.448 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.477 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.479 2 DEBUG nova.virt.disk.api [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.479 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.531 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.533 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308,backing_fmt=raw /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.561 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.564 2 DEBUG nova.virt.disk.api [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.566 2 DEBUG nova.objects.instance [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid ca651811-3d96-4b41-a50d-bbaeaf3da808 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.583 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308,backing_fmt=raw /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.rescue" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.583 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "2563015c4ccc448cfc2f148d9d2544bae12af308" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.583 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'migration_context' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.711 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.712 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Ensure instance console log exists: /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.713 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.713 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.714 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.750 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.755 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Start _get_guest_xml network_info=[{"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:b3:ab:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '029c31da-a19a-49fa-b3ae-74dda129673a', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.756 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'resources' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.841 2 WARNING nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.852 2 DEBUG nova.virt.libvirt.host [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.853 2 DEBUG nova.virt.libvirt.host [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.856 2 DEBUG nova.virt.libvirt.host [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.857 2 DEBUG nova.virt.libvirt.host [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.859 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.860 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.860 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.861 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.861 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.862 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.862 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.863 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.863 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.864 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.864 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.864 2 DEBUG nova.virt.hardware [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:27:49 np0005466013 nova_compute[192144]: 2025-10-02 12:27:49.865 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.159 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.254 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.255 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.256 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.257 2 DEBUG oslo_concurrency.lockutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.258 2 DEBUG nova.virt.libvirt.vif [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:27:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-784230985',display_name='tempest-ServerStableDeviceRescueTest-server-784230985',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-784230985',id=126,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:27:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-9mkkq6lq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:31Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=351ed18e-9759-4759-9a20-24f8b3d59908,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:b3:ab:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.259 2 DEBUG nova.network.os_vif_util [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:b3:ab:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.260 2 DEBUG nova.network.os_vif_util [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.261 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'pci_devices' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.403 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <uuid>351ed18e-9759-4759-9a20-24f8b3d59908</uuid>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <name>instance-0000007e</name>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-784230985</nova:name>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:27:49</nova:creationTime>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:user uuid="abb9f220716e48d79dbe2f97622937c4">tempest-ServerStableDeviceRescueTest-232864240-project-member</nova:user>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:project uuid="88e90c16adec46069b539d4f1431ab4d">tempest-ServerStableDeviceRescueTest-232864240</nova:project>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        <nova:port uuid="57aa5324-a5d7-424f-b3ea-20887a672a1c">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <entry name="serial">351ed18e-9759-4759-9a20-24f8b3d59908</entry>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <entry name="uuid">351ed18e-9759-4759-9a20-24f8b3d59908</entry>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.rescue"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <boot order="1"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:b3:ab:89"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <target dev="tap57aa5324-a5"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/console.log" append="off"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:27:50 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:27:50 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:27:50 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:27:50 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.415 2 INFO nova.virt.libvirt.driver [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance destroyed successfully.#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.810 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.811 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.811 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.811 2 DEBUG nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No VIF found with MAC fa:16:3e:b3:ab:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.812 2 INFO nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Using config drive#033[00m
Oct  2 08:27:50 np0005466013 nova_compute[192144]: 2025-10-02 12:27:50.942 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'ec2_ids' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:51 np0005466013 nova_compute[192144]: 2025-10-02 12:27:51.177 2 DEBUG nova.objects.instance [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'keypairs' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.023 2 INFO nova.virt.libvirt.driver [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Creating config drive at /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config.rescue#033[00m
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.029 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp26bwfx87 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.177 2 DEBUG oslo_concurrency.processutils [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp26bwfx87" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:52 np0005466013 kernel: tap57aa5324-a5: entered promiscuous mode
Oct  2 08:27:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:52Z|00525|binding|INFO|Claiming lport 57aa5324-a5d7-424f-b3ea-20887a672a1c for this chassis.
Oct  2 08:27:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:52Z|00526|binding|INFO|57aa5324-a5d7-424f-b3ea-20887a672a1c: Claiming fa:16:3e:b3:ab:89 10.100.0.8
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:52 np0005466013 NetworkManager[51205]: <info>  [1759408072.2869] manager: (tap57aa5324-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/237)
Oct  2 08:27:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:52Z|00527|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c ovn-installed in OVS
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:52 np0005466013 systemd-udevd[240766]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:27:52 np0005466013 NetworkManager[51205]: <info>  [1759408072.3279] device (tap57aa5324-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:27:52 np0005466013 NetworkManager[51205]: <info>  [1759408072.3288] device (tap57aa5324-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:27:52 np0005466013 systemd-machined[152202]: New machine qemu-61-instance-0000007e.
Oct  2 08:27:52 np0005466013 systemd[1]: Started Virtual Machine qemu-61-instance-0000007e.
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.642 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:ab:89 10.100.0.8'], port_security=['fa:16:3e:b3:ab:89 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '351ed18e-9759-4759-9a20-24f8b3d59908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=57aa5324-a5d7-424f-b3ea-20887a672a1c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.644 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 57aa5324-a5d7-424f-b3ea-20887a672a1c in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a bound to our chassis#033[00m
Oct  2 08:27:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:27:52Z|00528|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c up in Southbound
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.646 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.661 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[079ec782-bc45-4d43-8565-23ced11e3bb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.708 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f0172b23-a9d0-4cef-b5a0-ec092f38ac5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.712 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[8ecc4d43-8063-46ab-b124-8ca786d2036e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.752 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c676c5-f429-4edb-a93d-bd9e4e57ae08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.777 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3b36db-2cfb-4d80-aee7-77aefa2b6610]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 916, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 916, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240790, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.802 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f548640-2f1c-4f90-9e20-ee4bff7db1f7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594010, 'tstamp': 594010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240791, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594013, 'tstamp': 594013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240791, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.805 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:52 np0005466013 nova_compute[192144]: 2025-10-02 12:27:52.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.809 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.809 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.810 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:52 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:27:52.810 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.241 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 351ed18e-9759-4759-9a20-24f8b3d59908 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.242 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408073.2411501, 351ed18e-9759-4759-9a20-24f8b3d59908 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.242 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.405 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.410 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.469 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Successfully created port: 087d1308-7c0a-45ab-b876-1dfdceb622d7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.631 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.632 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408073.244729, 351ed18e-9759-4759-9a20-24f8b3d59908 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.632 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Started (Lifecycle Event)#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.688 2 DEBUG nova.compute.manager [None req-59c7270e-3b8c-4fc0-91a6-4585f963368b abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.913 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:53 np0005466013 nova_compute[192144]: 2025-10-02 12:27:53.918 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:27:54 np0005466013 nova_compute[192144]: 2025-10-02 12:27:54.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.116 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Successfully created port: 6949320c-d0cb-4a9d-a882-f6d1aac564c3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.863 2 DEBUG nova.compute.manager [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.864 2 DEBUG oslo_concurrency.lockutils [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.864 2 DEBUG oslo_concurrency.lockutils [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.865 2 DEBUG oslo_concurrency.lockutils [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.865 2 DEBUG nova.compute.manager [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.866 2 WARNING nova.compute.manager [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.866 2 DEBUG nova.compute.manager [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.866 2 DEBUG oslo_concurrency.lockutils [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.867 2 DEBUG oslo_concurrency.lockutils [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.867 2 DEBUG oslo_concurrency.lockutils [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.867 2 DEBUG nova.compute.manager [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:56 np0005466013 nova_compute[192144]: 2025-10-02 12:27:56.868 2 WARNING nova.compute.manager [req-28efc949-30ce-4b41-a2e3-87e3c16f1a50 req-c48292f7-a79d-4fae-a35f-64f6e14defb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:27:59 np0005466013 nova_compute[192144]: 2025-10-02 12:27:59.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:59 np0005466013 nova_compute[192144]: 2025-10-02 12:27:59.742 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Successfully updated port: 087d1308-7c0a-45ab-b876-1dfdceb622d7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:28:00 np0005466013 nova_compute[192144]: 2025-10-02 12:28:00.423 2 DEBUG nova.compute.manager [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-changed-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:00 np0005466013 nova_compute[192144]: 2025-10-02 12:28:00.424 2 DEBUG nova.compute.manager [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing instance network info cache due to event network-changed-087d1308-7c0a-45ab-b876-1dfdceb622d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:28:00 np0005466013 nova_compute[192144]: 2025-10-02 12:28:00.424 2 DEBUG oslo_concurrency.lockutils [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:28:00 np0005466013 nova_compute[192144]: 2025-10-02 12:28:00.424 2 DEBUG oslo_concurrency.lockutils [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:28:00 np0005466013 nova_compute[192144]: 2025-10-02 12:28:00.425 2 DEBUG nova.network.neutron [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing network info cache for port 087d1308-7c0a-45ab-b876-1dfdceb622d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:28:01 np0005466013 nova_compute[192144]: 2025-10-02 12:28:01.359 2 INFO nova.compute.manager [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Unrescuing#033[00m
Oct  2 08:28:01 np0005466013 nova_compute[192144]: 2025-10-02 12:28:01.360 2 DEBUG oslo_concurrency.lockutils [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:28:01 np0005466013 nova_compute[192144]: 2025-10-02 12:28:01.360 2 DEBUG oslo_concurrency.lockutils [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:28:01 np0005466013 nova_compute[192144]: 2025-10-02 12:28:01.360 2 DEBUG nova.network.neutron [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:28:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:02.312 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:02.313 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:02.314 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:03 np0005466013 nova_compute[192144]: 2025-10-02 12:28:03.631 2 DEBUG nova.network.neutron [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:28:03 np0005466013 podman[240793]: 2025-10-02 12:28:03.694813256 +0000 UTC m=+0.063527661 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:28:03 np0005466013 podman[240792]: 2025-10-02 12:28:03.701320441 +0000 UTC m=+0.070883023 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:28:03 np0005466013 podman[240794]: 2025-10-02 12:28:03.7274771 +0000 UTC m=+0.089592729 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.747 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Successfully updated port: 6949320c-d0cb-4a9d-a882-f6d1aac564c3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.842 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.940 2 DEBUG nova.compute.manager [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-changed-6949320c-d0cb-4a9d-a882-f6d1aac564c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.941 2 DEBUG nova.compute.manager [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing instance network info cache due to event network-changed-6949320c-d0cb-4a9d-a882-f6d1aac564c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:28:04 np0005466013 nova_compute[192144]: 2025-10-02 12:28:04.941 2 DEBUG oslo_concurrency.lockutils [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:28:05 np0005466013 nova_compute[192144]: 2025-10-02 12:28:05.600 2 DEBUG nova.network.neutron [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:05 np0005466013 nova_compute[192144]: 2025-10-02 12:28:05.686 2 DEBUG oslo_concurrency.lockutils [req-8c90da96-9cfd-4da6-b0b6-7a8907e2db35 req-4de35242-5cfa-45dd-853d-d261f0983d7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:05 np0005466013 nova_compute[192144]: 2025-10-02 12:28:05.688 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:28:05 np0005466013 nova_compute[192144]: 2025-10-02 12:28:05.689 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:28:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:06Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b3:ab:89 10.100.0.8
Oct  2 08:28:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:06Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:ab:89 10.100.0.8
Oct  2 08:28:06 np0005466013 nova_compute[192144]: 2025-10-02 12:28:06.654 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.095 2 DEBUG nova.network.neutron [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Updating instance_info_cache with network_info: [{"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.123 2 DEBUG oslo_concurrency.lockutils [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-351ed18e-9759-4759-9a20-24f8b3d59908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.124 2 DEBUG nova.objects.instance [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'flavor' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:09 np0005466013 kernel: tap57aa5324-a5 (unregistering): left promiscuous mode
Oct  2 08:28:09 np0005466013 NetworkManager[51205]: <info>  [1759408089.2234] device (tap57aa5324-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:09Z|00529|binding|INFO|Releasing lport 57aa5324-a5d7-424f-b3ea-20887a672a1c from this chassis (sb_readonly=0)
Oct  2 08:28:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:09Z|00530|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c down in Southbound
Oct  2 08:28:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:09Z|00531|binding|INFO|Removing iface tap57aa5324-a5 ovn-installed in OVS
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.245 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:ab:89 10.100.0.8'], port_security=['fa:16:3e:b3:ab:89 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '351ed18e-9759-4759-9a20-24f8b3d59908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=57aa5324-a5d7-424f-b3ea-20887a672a1c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.248 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 57aa5324-a5d7-424f-b3ea-20887a672a1c in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.252 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.280 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7863aa-e1a2-4da0-b5e1-a8366e2ace07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Oct  2 08:28:09 np0005466013 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000007e.scope: Consumed 13.226s CPU time.
Oct  2 08:28:09 np0005466013 systemd-machined[152202]: Machine qemu-61-instance-0000007e terminated.
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.329 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f01835-1419-47e0-b755-9d69fd2ad0c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.335 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fa58e665-3fa8-4694-9e84-eeb1174c3d43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.371 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b54437c5-737d-4346-a7a1-9715bbe11bab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.397 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0bad3333-20de-4bb1-8f6e-a46783fe9b02]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 916, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 916, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240885, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.423 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[28169d22-8a90-463f-b26e-f309a778235b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594010, 'tstamp': 594010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240886, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594013, 'tstamp': 594013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240886, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.425 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.448 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.448 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.449 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.449 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.515 2 INFO nova.virt.libvirt.driver [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance destroyed successfully.#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.516 2 DEBUG nova.objects.instance [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'numa_topology' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 kernel: tap57aa5324-a5: entered promiscuous mode
Oct  2 08:28:09 np0005466013 systemd-udevd[240877]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:28:09 np0005466013 NetworkManager[51205]: <info>  [1759408089.7127] manager: (tap57aa5324-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Oct  2 08:28:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:09Z|00532|binding|INFO|Claiming lport 57aa5324-a5d7-424f-b3ea-20887a672a1c for this chassis.
Oct  2 08:28:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:09Z|00533|binding|INFO|57aa5324-a5d7-424f-b3ea-20887a672a1c: Claiming fa:16:3e:b3:ab:89 10.100.0.8
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 NetworkManager[51205]: <info>  [1759408089.7269] device (tap57aa5324-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:28:09 np0005466013 NetworkManager[51205]: <info>  [1759408089.7277] device (tap57aa5324-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.729 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:ab:89 10.100.0.8'], port_security=['fa:16:3e:b3:ab:89 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '351ed18e-9759-4759-9a20-24f8b3d59908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=57aa5324-a5d7-424f-b3ea-20887a672a1c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.730 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 57aa5324-a5d7-424f-b3ea-20887a672a1c in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a bound to our chassis#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.732 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:28:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:09Z|00534|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c ovn-installed in OVS
Oct  2 08:28:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:09Z|00535|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c up in Southbound
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.752 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8db561c9-5035-4e57-9432-cc5509e6f54a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 systemd-machined[152202]: New machine qemu-62-instance-0000007e.
Oct  2 08:28:09 np0005466013 systemd[1]: Started Virtual Machine qemu-62-instance-0000007e.
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.785 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f340f668-cd49-424e-b08f-bb32c6f8cb1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.789 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[528e622b-5789-4a2e-90f7-d226a382b27c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.822 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b3971309-ff27-4a86-8bfe-67a3f3a9c137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.840 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a629d30e-1f81-40a5-bf8e-8c058cb7bc1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 916, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 916, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240935, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.858 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[14eea60c-9553-4f68-b79b-a9e51e04509b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594010, 'tstamp': 594010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240937, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594013, 'tstamp': 594013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240937, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.860 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 nova_compute[192144]: 2025-10-02 12:28:09.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.863 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.863 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.863 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:09.864 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:10 np0005466013 nova_compute[192144]: 2025-10-02 12:28:10.906 2 DEBUG nova.compute.manager [req-73c75e9c-9380-4b76-8008-47d4157c930c req-d7457604-b89b-470c-8c89-419aed2ea0b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:10 np0005466013 nova_compute[192144]: 2025-10-02 12:28:10.907 2 DEBUG oslo_concurrency.lockutils [req-73c75e9c-9380-4b76-8008-47d4157c930c req-d7457604-b89b-470c-8c89-419aed2ea0b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:10 np0005466013 nova_compute[192144]: 2025-10-02 12:28:10.907 2 DEBUG oslo_concurrency.lockutils [req-73c75e9c-9380-4b76-8008-47d4157c930c req-d7457604-b89b-470c-8c89-419aed2ea0b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:10 np0005466013 nova_compute[192144]: 2025-10-02 12:28:10.908 2 DEBUG oslo_concurrency.lockutils [req-73c75e9c-9380-4b76-8008-47d4157c930c req-d7457604-b89b-470c-8c89-419aed2ea0b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:10 np0005466013 nova_compute[192144]: 2025-10-02 12:28:10.908 2 DEBUG nova.compute.manager [req-73c75e9c-9380-4b76-8008-47d4157c930c req-d7457604-b89b-470c-8c89-419aed2ea0b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:10 np0005466013 nova_compute[192144]: 2025-10-02 12:28:10.909 2 WARNING nova.compute.manager [req-73c75e9c-9380-4b76-8008-47d4157c930c req-d7457604-b89b-470c-8c89-419aed2ea0b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:28:11 np0005466013 nova_compute[192144]: 2025-10-02 12:28:11.074 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 351ed18e-9759-4759-9a20-24f8b3d59908 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:28:11 np0005466013 nova_compute[192144]: 2025-10-02 12:28:11.075 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408091.0742023, 351ed18e-9759-4759-9a20-24f8b3d59908 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:11 np0005466013 nova_compute[192144]: 2025-10-02 12:28:11.075 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:28:11 np0005466013 nova_compute[192144]: 2025-10-02 12:28:11.080 2 DEBUG nova.compute.manager [None req-6d24ac46-d7e3-4b70-bb9b-da2ac76f7911 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.068 2 DEBUG nova.network.neutron [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updating instance_info_cache with network_info: [{"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.516 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.521 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.549 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.550 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Instance network_info: |[{"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.551 2 DEBUG oslo_concurrency.lockutils [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.551 2 DEBUG nova.network.neutron [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing network info cache for port 6949320c-d0cb-4a9d-a882-f6d1aac564c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.555 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Start _get_guest_xml network_info=[{"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.557 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.558 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408091.0783608, 351ed18e-9759-4759-9a20-24f8b3d59908 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.558 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Started (Lifecycle Event)#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.564 2 WARNING nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.570 2 DEBUG nova.virt.libvirt.host [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.571 2 DEBUG nova.virt.libvirt.host [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.575 2 DEBUG nova.virt.libvirt.host [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.576 2 DEBUG nova.virt.libvirt.host [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.579 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.579 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.579 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.580 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.580 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.580 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.581 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.581 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.581 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.581 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.582 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.582 2 DEBUG nova.virt.hardware [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.585 2 DEBUG nova.virt.libvirt.vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-986630317',display_name='tempest-TestGettingAddress-server-986630317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-986630317',id=128,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ku485ehg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:48Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=ca651811-3d96-4b41-a50d-bbaeaf3da808,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.586 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.587 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:b7:65,bridge_name='br-int',has_traffic_filtering=True,id=087d1308-7c0a-45ab-b876-1dfdceb622d7,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087d1308-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.587 2 DEBUG nova.virt.libvirt.vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-986630317',display_name='tempest-TestGettingAddress-server-986630317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-986630317',id=128,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ku485ehg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:48Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=ca651811-3d96-4b41-a50d-bbaeaf3da808,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.588 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.588 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:69:cf,bridge_name='br-int',has_traffic_filtering=True,id=6949320c-d0cb-4a9d-a882-f6d1aac564c3,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6949320c-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.589 2 DEBUG nova.objects.instance [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid ca651811-3d96-4b41-a50d-bbaeaf3da808 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.613 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.623 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <uuid>ca651811-3d96-4b41-a50d-bbaeaf3da808</uuid>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <name>instance-00000080</name>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestGettingAddress-server-986630317</nova:name>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:28:13</nova:creationTime>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:port uuid="087d1308-7c0a-45ab-b876-1dfdceb622d7">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        <nova:port uuid="6949320c-d0cb-4a9d-a882-f6d1aac564c3">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe49:69cf" ipVersion="6"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <entry name="serial">ca651811-3d96-4b41-a50d-bbaeaf3da808</entry>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <entry name="uuid">ca651811-3d96-4b41-a50d-bbaeaf3da808</entry>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.config"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:ac:b7:65"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <target dev="tap087d1308-7c"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:49:69:cf"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <target dev="tap6949320c-d0"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/console.log" append="off"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:28:13 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:28:13 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:28:13 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:28:13 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.624 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Preparing to wait for external event network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.624 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.624 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.624 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.624 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Preparing to wait for external event network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.625 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.625 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.625 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.626 2 DEBUG nova.virt.libvirt.vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-986630317',display_name='tempest-TestGettingAddress-server-986630317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-986630317',id=128,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ku485ehg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:48Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=ca651811-3d96-4b41-a50d-bbaeaf3da808,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.626 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.634 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:b7:65,bridge_name='br-int',has_traffic_filtering=True,id=087d1308-7c0a-45ab-b876-1dfdceb622d7,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087d1308-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.634 2 DEBUG os_vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:b7:65,bridge_name='br-int',has_traffic_filtering=True,id=087d1308-7c0a-45ab-b876-1dfdceb622d7,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087d1308-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.642 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.645 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap087d1308-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap087d1308-7c, col_values=(('external_ids', {'iface-id': '087d1308-7c0a-45ab-b876-1dfdceb622d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:b7:65', 'vm-uuid': 'ca651811-3d96-4b41-a50d-bbaeaf3da808'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:13 np0005466013 NetworkManager[51205]: <info>  [1759408093.6920] manager: (tap087d1308-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.700 2 INFO os_vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:b7:65,bridge_name='br-int',has_traffic_filtering=True,id=087d1308-7c0a-45ab-b876-1dfdceb622d7,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087d1308-7c')#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.700 2 DEBUG nova.virt.libvirt.vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-986630317',display_name='tempest-TestGettingAddress-server-986630317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-986630317',id=128,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ku485ehg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:27:48Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=ca651811-3d96-4b41-a50d-bbaeaf3da808,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.701 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.701 2 DEBUG nova.network.os_vif_util [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:69:cf,bridge_name='br-int',has_traffic_filtering=True,id=6949320c-d0cb-4a9d-a882-f6d1aac564c3,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6949320c-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.702 2 DEBUG os_vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:69:cf,bridge_name='br-int',has_traffic_filtering=True,id=6949320c-d0cb-4a9d-a882-f6d1aac564c3,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6949320c-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.702 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.702 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.705 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6949320c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.705 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6949320c-d0, col_values=(('external_ids', {'iface-id': '6949320c-d0cb-4a9d-a882-f6d1aac564c3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:69:cf', 'vm-uuid': 'ca651811-3d96-4b41-a50d-bbaeaf3da808'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 NetworkManager[51205]: <info>  [1759408093.7083] manager: (tap6949320c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.714 2 INFO os_vif [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:69:cf,bridge_name='br-int',has_traffic_filtering=True,id=6949320c-d0cb-4a9d-a882-f6d1aac564c3,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6949320c-d0')#033[00m
Oct  2 08:28:13 np0005466013 podman[240947]: 2025-10-02 12:28:13.730681357 +0000 UTC m=+0.092560492 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:28:13 np0005466013 podman[240948]: 2025-10-02 12:28:13.758933192 +0000 UTC m=+0.120284199 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:28:13 np0005466013 podman[240946]: 2025-10-02 12:28:13.766644783 +0000 UTC m=+0.128945991 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.867 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.867 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.867 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:ac:b7:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.867 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:49:69:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.868 2 INFO nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Using config drive#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.996 2 DEBUG nova.compute.manager [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.997 2 DEBUG oslo_concurrency.lockutils [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.997 2 DEBUG oslo_concurrency.lockutils [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.997 2 DEBUG oslo_concurrency.lockutils [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.997 2 DEBUG nova.compute.manager [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.998 2 WARNING nova.compute.manager [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.998 2 DEBUG nova.compute.manager [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.998 2 DEBUG oslo_concurrency.lockutils [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.998 2 DEBUG oslo_concurrency.lockutils [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.999 2 DEBUG oslo_concurrency.lockutils [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.999 2 DEBUG nova.compute.manager [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:13 np0005466013 nova_compute[192144]: 2025-10-02 12:28:13.999 2 WARNING nova.compute.manager [req-7c377d27-4294-47d9-aa48-5de30620b8ba req-587ef9af-fc65-46b1-a50c-dbcd469bd695 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:28:14 np0005466013 nova_compute[192144]: 2025-10-02 12:28:14.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:14 np0005466013 nova_compute[192144]: 2025-10-02 12:28:14.910 2 INFO nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Creating config drive at /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.config#033[00m
Oct  2 08:28:14 np0005466013 nova_compute[192144]: 2025-10-02 12:28:14.915 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5osroc57 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.046 2 DEBUG oslo_concurrency.processutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5osroc57" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:15 np0005466013 kernel: tap087d1308-7c: entered promiscuous mode
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.1232] manager: (tap087d1308-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00536|binding|INFO|Claiming lport 087d1308-7c0a-45ab-b876-1dfdceb622d7 for this chassis.
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00537|binding|INFO|087d1308-7c0a-45ab-b876-1dfdceb622d7: Claiming fa:16:3e:ac:b7:65 10.100.0.13
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.1498] manager: (tap6949320c-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Oct  2 08:28:15 np0005466013 systemd-udevd[241029]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:28:15 np0005466013 systemd-udevd[241028]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.1847] device (tap087d1308-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.1860] device (tap087d1308-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:28:15 np0005466013 systemd-machined[152202]: New machine qemu-63-instance-00000080.
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.215 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:b7:65 10.100.0.13'], port_security=['fa:16:3e:ac:b7:65 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a731a9f5-9e55-440a-a95e-a9a819598de7, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=087d1308-7c0a-45ab-b876-1dfdceb622d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.216 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 087d1308-7c0a-45ab-b876-1dfdceb622d7 in datapath d68eafa7-b35f-4bd9-ba11-e28a73bc7849 bound to our chassis#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.218 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d68eafa7-b35f-4bd9-ba11-e28a73bc7849#033[00m
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 systemd[1]: Started Virtual Machine qemu-63-instance-00000080.
Oct  2 08:28:15 np0005466013 kernel: tap6949320c-d0: entered promiscuous mode
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.2292] device (tap6949320c-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.2310] device (tap6949320c-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00538|binding|INFO|Claiming lport 6949320c-d0cb-4a9d-a882-f6d1aac564c3 for this chassis.
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00539|binding|INFO|6949320c-d0cb-4a9d-a882-f6d1aac564c3: Claiming fa:16:3e:49:69:cf 2001:db8::f816:3eff:fe49:69cf
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.235 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[82be78dc-1079-4b54-b070-13aec4974e5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.236 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd68eafa7-b1 in ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.246 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd68eafa7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.246 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0599e1-7f55-4975-872b-ff07bdd3c212]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00540|binding|INFO|Setting lport 087d1308-7c0a-45ab-b876-1dfdceb622d7 ovn-installed in OVS
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.248 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[69e87626-3ae4-4f11-ab39-aec755739212]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00541|binding|INFO|Setting lport 087d1308-7c0a-45ab-b876-1dfdceb622d7 up in Southbound
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.252 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:69:cf 2001:db8::f816:3eff:fe49:69cf'], port_security=['fa:16:3e:49:69:cf 2001:db8::f816:3eff:fe49:69cf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe49:69cf/64', 'neutron:device_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85092873-751b-414a-a9a1-112c2e61cb13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1cb8f94-a0b5-458e-a15a-45916ae4369f, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=6949320c-d0cb-4a9d-a882-f6d1aac564c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00542|binding|INFO|Setting lport 6949320c-d0cb-4a9d-a882-f6d1aac564c3 ovn-installed in OVS
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.266 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0ef0f2-e584-4461-845f-d713b2d69d9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00543|binding|INFO|Setting lport 6949320c-d0cb-4a9d-a882-f6d1aac564c3 up in Southbound
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.285 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[06523885-0e9b-4393-9616-35cf2f2d51a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.320 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[98a49d42-a06c-480c-b7fd-c773a603ab67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.3289] manager: (tapd68eafa7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.328 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f4bc4e77-2256-465e-bd60-c5f04b5844d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.367 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[17f09b72-a3a5-4bc4-82b0-9b5372bb072c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.370 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0f3d7b7b-092c-44f5-b005-873ca11b0866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.4017] device (tapd68eafa7-b0): carrier: link connected
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.407 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[59f53c25-2530-4e0d-8c74-33a684f67989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.425 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[19ff16d4-294a-491d-a402-3783ec56f3ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd68eafa7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:32:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608701, 'reachable_time': 40715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241065, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.446 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.454 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8129225e-98f0-4d4b-ba68-46578f2af233]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:32d2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 608701, 'tstamp': 608701}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241066, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.474 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[72b20bb1-b92a-4798-b68c-ddf75a305179]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd68eafa7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:32:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608701, 'reachable_time': 40715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241067, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.510 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ae48c472-c3f4-40ff-935a-78ceca7f0e67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.602 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a08ed5-054a-4fd2-b7cb-1f42e4922b57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.603 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd68eafa7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.604 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.604 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd68eafa7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 NetworkManager[51205]: <info>  [1759408095.6074] manager: (tapd68eafa7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Oct  2 08:28:15 np0005466013 kernel: tapd68eafa7-b0: entered promiscuous mode
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.613 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd68eafa7-b0, col_values=(('external_ids', {'iface-id': '41abd6d0-501e-436d-9123-d8936335a0b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:15Z|00544|binding|INFO|Releasing lport 41abd6d0-501e-436d-9123-d8936335a0b5 from this chassis (sb_readonly=0)
Oct  2 08:28:15 np0005466013 nova_compute[192144]: 2025-10-02 12:28:15.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.647 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.648 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a7fbe316-987b-45bf-be06-084adc229b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.649 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-d68eafa7-b35f-4bd9-ba11-e28a73bc7849
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.pid.haproxy
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID d68eafa7-b35f-4bd9-ba11-e28a73bc7849
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:28:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:15.649 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'env', 'PROCESS_TAG=haproxy-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.027 2 DEBUG nova.compute.manager [req-41ee3772-3842-4ec2-a875-4a1924d51ccd req-f99a5615-ee98-47b9-b2a2-312fa78f5126 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.028 2 DEBUG oslo_concurrency.lockutils [req-41ee3772-3842-4ec2-a875-4a1924d51ccd req-f99a5615-ee98-47b9-b2a2-312fa78f5126 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.029 2 DEBUG oslo_concurrency.lockutils [req-41ee3772-3842-4ec2-a875-4a1924d51ccd req-f99a5615-ee98-47b9-b2a2-312fa78f5126 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.029 2 DEBUG oslo_concurrency.lockutils [req-41ee3772-3842-4ec2-a875-4a1924d51ccd req-f99a5615-ee98-47b9-b2a2-312fa78f5126 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.029 2 DEBUG nova.compute.manager [req-41ee3772-3842-4ec2-a875-4a1924d51ccd req-f99a5615-ee98-47b9-b2a2-312fa78f5126 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Processing event network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.114 2 DEBUG nova.compute.manager [req-90ee0a87-5732-4ffe-8208-cd082da8a876 req-540ce64c-7904-423b-bbf8-13b64b4d8b52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.115 2 DEBUG oslo_concurrency.lockutils [req-90ee0a87-5732-4ffe-8208-cd082da8a876 req-540ce64c-7904-423b-bbf8-13b64b4d8b52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.115 2 DEBUG oslo_concurrency.lockutils [req-90ee0a87-5732-4ffe-8208-cd082da8a876 req-540ce64c-7904-423b-bbf8-13b64b4d8b52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.115 2 DEBUG oslo_concurrency.lockutils [req-90ee0a87-5732-4ffe-8208-cd082da8a876 req-540ce64c-7904-423b-bbf8-13b64b4d8b52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.116 2 DEBUG nova.compute.manager [req-90ee0a87-5732-4ffe-8208-cd082da8a876 req-540ce64c-7904-423b-bbf8-13b64b4d8b52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Processing event network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:28:16 np0005466013 podman[241107]: 2025-10-02 12:28:16.062587881 +0000 UTC m=+0.029788304 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:28:16 np0005466013 podman[241107]: 2025-10-02 12:28:16.168427627 +0000 UTC m=+0.135628020 container create fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:28:16 np0005466013 systemd[1]: Started libpod-conmon-fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd.scope.
Oct  2 08:28:16 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:28:16 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f99c4c5eaf4e543d946adc95f4a497ce712e8a181ed8712901a3b0bdc688f2ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:28:16 np0005466013 podman[241107]: 2025-10-02 12:28:16.336230105 +0000 UTC m=+0.303430508 container init fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:28:16 np0005466013 podman[241107]: 2025-10-02 12:28:16.345754934 +0000 UTC m=+0.312955307 container start fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:28:16 np0005466013 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[241122]: [NOTICE]   (241126) : New worker (241128) forked
Oct  2 08:28:16 np0005466013 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[241122]: [NOTICE]   (241126) : Loading success.
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.422 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 6949320c-d0cb-4a9d-a882-f6d1aac564c3 in datapath 85092873-751b-414a-a9a1-112c2e61cb13 unbound from our chassis#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.424 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85092873-751b-414a-a9a1-112c2e61cb13#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.425 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408096.4218938, ca651811-3d96-4b41-a50d-bbaeaf3da808 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.425 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] VM Started (Lifecycle Event)#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.427 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.431 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.437 2 INFO nova.virt.libvirt.driver [-] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Instance spawned successfully.#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.438 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.455 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6b0089fb-c463-400d-8204-5a3c49e8455c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.456 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap85092873-71 in ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.459 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap85092873-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.459 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8827f932-409b-402d-85ca-cbcd25857b46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.460 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a016c4-091c-47de-91a5-4ef55f2a33fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.476 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8b6663-1cdc-47df-be10-1d2fcc6c7d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.506 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc3b935-a067-4fc0-b426-27b0d2b0b08e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.517 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.524 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.525 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.525 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.526 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.526 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.527 2 DEBUG nova.virt.libvirt.driver [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.533 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.552 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f68737a2-e66a-4886-a34a-706d8c05d4f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 NetworkManager[51205]: <info>  [1759408096.5650] manager: (tap85092873-70): new Veth device (/org/freedesktop/NetworkManager/Devices/245)
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.566 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[360564cd-33a8-44d7-99f3-56bd993ef5b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.603 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[45d11d47-ba25-470b-afb2-8cdf513b9858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.607 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f5bb22e6-f228-4e3c-938b-b5fc3bd8ecab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 NetworkManager[51205]: <info>  [1759408096.6331] device (tap85092873-70): carrier: link connected
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.641 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2dc84c-65be-4611-aae5-60d6e24684e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.667 2 DEBUG nova.network.neutron [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updated VIF entry in instance network info cache for port 6949320c-d0cb-4a9d-a882-f6d1aac564c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.667 2 DEBUG nova.network.neutron [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updating instance_info_cache with network_info: [{"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.671 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[045d60ae-dcca-46de-bb55-dfae67fb1de5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85092873-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:05:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608825, 'reachable_time': 28537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241147, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.693 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[32df3dc7-43ae-4500-9dae-b6597b498370]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:5f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 608825, 'tstamp': 608825}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241148, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.714 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f920f08-dfaf-489a-ae18-13a0bf5640f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85092873-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:05:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608825, 'reachable_time': 28537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241149, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.754 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5dc1500c-03e3-4459-aece-6e971387e968]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.795 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4785f6-e964-4b87-82ab-430c9f45b009]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.797 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85092873-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.797 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.798 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85092873-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:16 np0005466013 NetworkManager[51205]: <info>  [1759408096.8008] manager: (tap85092873-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:16 np0005466013 kernel: tap85092873-70: entered promiscuous mode
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.807 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85092873-70, col_values=(('external_ids', {'iface-id': '8baac968-8bfc-4d6e-93cc-be5861eaa459'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:16Z|00545|binding|INFO|Releasing lport 8baac968-8bfc-4d6e-93cc-be5861eaa459 from this chassis (sb_readonly=0)
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.812 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/85092873-751b-414a-a9a1-112c2e61cb13.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/85092873-751b-414a-a9a1-112c2e61cb13.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.813 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2f3d3197-10a4-4ac2-a603-624cdd78c806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.814 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-85092873-751b-414a-a9a1-112c2e61cb13
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/85092873-751b-414a-a9a1-112c2e61cb13.pid.haproxy
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 85092873-751b-414a-a9a1-112c2e61cb13
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:28:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:16.815 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'env', 'PROCESS_TAG=haproxy-85092873-751b-414a-a9a1-112c2e61cb13', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/85092873-751b-414a-a9a1-112c2e61cb13.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:28:16 np0005466013 nova_compute[192144]: 2025-10-02 12:28:16.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:17 np0005466013 podman[241180]: 2025-10-02 12:28:17.176600366 +0000 UTC m=+0.034104540 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:28:17 np0005466013 podman[241180]: 2025-10-02 12:28:17.317117599 +0000 UTC m=+0.174621753 container create b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:28:17 np0005466013 systemd[1]: Started libpod-conmon-b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687.scope.
Oct  2 08:28:17 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:28:17 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bd4dd73780ac142a6cea288365b78ad003e2dc4b791dd69b90825c1bd164fef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:28:17 np0005466013 podman[241180]: 2025-10-02 12:28:17.530189305 +0000 UTC m=+0.387693469 container init b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:28:17 np0005466013 podman[241180]: 2025-10-02 12:28:17.537655039 +0000 UTC m=+0.395159193 container start b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:28:17 np0005466013 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[241195]: [NOTICE]   (241199) : New worker (241201) forked
Oct  2 08:28:17 np0005466013 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[241195]: [NOTICE]   (241199) : Loading success.
Oct  2 08:28:17 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:17.611 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:28:17 np0005466013 nova_compute[192144]: 2025-10-02 12:28:17.909 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:28:17 np0005466013 nova_compute[192144]: 2025-10-02 12:28:17.910 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408096.4220238, ca651811-3d96-4b41-a50d-bbaeaf3da808 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:17 np0005466013 nova_compute[192144]: 2025-10-02 12:28:17.910 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:28:17 np0005466013 nova_compute[192144]: 2025-10-02 12:28:17.920 2 DEBUG oslo_concurrency.lockutils [req-af3eb691-3a9b-4406-a31d-64c025b3c4f9 req-da9f7104-0b10-43d9-9350-5cd347aeeeff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:17 np0005466013 nova_compute[192144]: 2025-10-02 12:28:17.946 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:17 np0005466013 nova_compute[192144]: 2025-10-02 12:28:17.950 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408096.4310925, ca651811-3d96-4b41-a50d-bbaeaf3da808 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:17 np0005466013 nova_compute[192144]: 2025-10-02 12:28:17.950 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.002 2 INFO nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Took 28.90 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.003 2 DEBUG nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.005 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.014 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.132 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.312 2 DEBUG nova.compute.manager [req-01b457ec-e08e-4020-98f4-3ca318827794 req-dc6ebbc9-a34b-41f1-805c-9a5be2563c63 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.316 2 DEBUG oslo_concurrency.lockutils [req-01b457ec-e08e-4020-98f4-3ca318827794 req-dc6ebbc9-a34b-41f1-805c-9a5be2563c63 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.317 2 DEBUG oslo_concurrency.lockutils [req-01b457ec-e08e-4020-98f4-3ca318827794 req-dc6ebbc9-a34b-41f1-805c-9a5be2563c63 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.318 2 DEBUG oslo_concurrency.lockutils [req-01b457ec-e08e-4020-98f4-3ca318827794 req-dc6ebbc9-a34b-41f1-805c-9a5be2563c63 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.318 2 DEBUG nova.compute.manager [req-01b457ec-e08e-4020-98f4-3ca318827794 req-dc6ebbc9-a34b-41f1-805c-9a5be2563c63 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] No waiting events found dispatching network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.318 2 WARNING nova.compute.manager [req-01b457ec-e08e-4020-98f4-3ca318827794 req-dc6ebbc9-a34b-41f1-805c-9a5be2563c63 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received unexpected event network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.369 2 INFO nova.compute.manager [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Took 36.47 seconds to build instance.#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.419 2 DEBUG nova.compute.manager [req-6fa6bcb0-8a90-4e83-99c9-008f058eb648 req-436a8992-d1a8-4d12-b4b4-be130d5d8d34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.420 2 DEBUG oslo_concurrency.lockutils [req-6fa6bcb0-8a90-4e83-99c9-008f058eb648 req-436a8992-d1a8-4d12-b4b4-be130d5d8d34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.421 2 DEBUG oslo_concurrency.lockutils [req-6fa6bcb0-8a90-4e83-99c9-008f058eb648 req-436a8992-d1a8-4d12-b4b4-be130d5d8d34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.421 2 DEBUG oslo_concurrency.lockutils [req-6fa6bcb0-8a90-4e83-99c9-008f058eb648 req-436a8992-d1a8-4d12-b4b4-be130d5d8d34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.422 2 DEBUG nova.compute.manager [req-6fa6bcb0-8a90-4e83-99c9-008f058eb648 req-436a8992-d1a8-4d12-b4b4-be130d5d8d34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.423 2 WARNING nova.compute.manager [req-6fa6bcb0-8a90-4e83-99c9-008f058eb648 req-436a8992-d1a8-4d12-b4b4-be130d5d8d34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.428 2 DEBUG oslo_concurrency.lockutils [None req-bf1aac43-c255-4d98-940c-2a43f20e16e9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 37.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.551 2 DEBUG nova.compute.manager [req-1fef4533-bfbb-4929-ae53-59ba97b8dc24 req-718d44d2-5b65-4b73-ae11-2266f595eef8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.551 2 DEBUG oslo_concurrency.lockutils [req-1fef4533-bfbb-4929-ae53-59ba97b8dc24 req-718d44d2-5b65-4b73-ae11-2266f595eef8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.552 2 DEBUG oslo_concurrency.lockutils [req-1fef4533-bfbb-4929-ae53-59ba97b8dc24 req-718d44d2-5b65-4b73-ae11-2266f595eef8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.552 2 DEBUG oslo_concurrency.lockutils [req-1fef4533-bfbb-4929-ae53-59ba97b8dc24 req-718d44d2-5b65-4b73-ae11-2266f595eef8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.553 2 DEBUG nova.compute.manager [req-1fef4533-bfbb-4929-ae53-59ba97b8dc24 req-718d44d2-5b65-4b73-ae11-2266f595eef8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] No waiting events found dispatching network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.553 2 WARNING nova.compute.manager [req-1fef4533-bfbb-4929-ae53-59ba97b8dc24 req-718d44d2-5b65-4b73-ae11-2266f595eef8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received unexpected event network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.894 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.895 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.895 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.895 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.895 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.931 2 INFO nova.compute.manager [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Terminating instance#033[00m
Oct  2 08:28:18 np0005466013 nova_compute[192144]: 2025-10-02 12:28:18.949 2 DEBUG nova.compute.manager [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:28:18 np0005466013 kernel: tap57aa5324-a5 (unregistering): left promiscuous mode
Oct  2 08:28:18 np0005466013 NetworkManager[51205]: <info>  [1759408098.9826] device (tap57aa5324-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:28:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:19Z|00546|binding|INFO|Releasing lport 57aa5324-a5d7-424f-b3ea-20887a672a1c from this chassis (sb_readonly=0)
Oct  2 08:28:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:19Z|00547|binding|INFO|Setting lport 57aa5324-a5d7-424f-b3ea-20887a672a1c down in Southbound
Oct  2 08:28:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:19Z|00548|binding|INFO|Removing iface tap57aa5324-a5 ovn-installed in OVS
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Oct  2 08:28:19 np0005466013 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007e.scope: Consumed 9.204s CPU time.
Oct  2 08:28:19 np0005466013 systemd-machined[152202]: Machine qemu-62-instance-0000007e terminated.
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.064 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:ab:89 10.100.0.8'], port_security=['fa:16:3e:b3:ab:89 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '351ed18e-9759-4759-9a20-24f8b3d59908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=57aa5324-a5d7-424f-b3ea-20887a672a1c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.066 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 57aa5324-a5d7-424f-b3ea-20887a672a1c in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.068 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.086 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e418f374-51f9-4304-ba6a-16c25f53f461]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:19 np0005466013 podman[241213]: 2025-10-02 12:28:19.10603696 +0000 UTC m=+0.088509524 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:28:19 np0005466013 podman[241210]: 2025-10-02 12:28:19.12166087 +0000 UTC m=+0.113379124 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.135 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4455d5-cf05-4eae-82b6-70703ea8e807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.140 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[af8060f3-c52a-45ac-a712-dee16f2e98ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.181 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[36b8bd4a-91f1-40dc-8b01-e30f0770d221]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.210 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e6256163-208d-4531-a4eb-62d911a3f9ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 16, 'rx_bytes': 916, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 16, 'rx_bytes': 916, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593994, 'reachable_time': 23403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241270, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.236 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c8296550-0170-4541-aced-c5703930819e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594010, 'tstamp': 594010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241277, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaa4ebb90-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594013, 'tstamp': 594013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241277, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.239 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.250 2 INFO nova.virt.libvirt.driver [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Instance destroyed successfully.#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.250 2 DEBUG nova.objects.instance [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'resources' on Instance uuid 351ed18e-9759-4759-9a20-24f8b3d59908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.254 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.255 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.255 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:19.256 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.810 2 DEBUG nova.virt.libvirt.vif [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:27:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-784230985',display_name='tempest-ServerStableDeviceRescueTest-server-784230985',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-784230985',id=126,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:27:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-9mkkq6lq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:28:13Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=351ed18e-9759-4759-9a20-24f8b3d59908,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.811 2 DEBUG nova.network.os_vif_util [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "address": "fa:16:3e:b3:ab:89", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57aa5324-a5", "ovs_interfaceid": "57aa5324-a5d7-424f-b3ea-20887a672a1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.812 2 DEBUG nova.network.os_vif_util [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.813 2 DEBUG os_vif [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.816 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57aa5324-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.826 2 INFO os_vif [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:ab:89,bridge_name='br-int',has_traffic_filtering=True,id=57aa5324-a5d7-424f-b3ea-20887a672a1c,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57aa5324-a5')#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.827 2 INFO nova.virt.libvirt.driver [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Deleting instance files /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908_del#033[00m
Oct  2 08:28:19 np0005466013 nova_compute[192144]: 2025-10-02 12:28:19.828 2 INFO nova.virt.libvirt.driver [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Deletion of /var/lib/nova/instances/351ed18e-9759-4759-9a20-24f8b3d59908_del complete#033[00m
Oct  2 08:28:20 np0005466013 nova_compute[192144]: 2025-10-02 12:28:20.056 2 INFO nova.compute.manager [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Took 1.11 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:28:20 np0005466013 nova_compute[192144]: 2025-10-02 12:28:20.058 2 DEBUG oslo.service.loopingcall [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:28:20 np0005466013 nova_compute[192144]: 2025-10-02 12:28:20.058 2 DEBUG nova.compute.manager [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:28:20 np0005466013 nova_compute[192144]: 2025-10-02 12:28:20.058 2 DEBUG nova.network.neutron [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:28:21 np0005466013 nova_compute[192144]: 2025-10-02 12:28:21.793 2 DEBUG nova.compute.manager [req-aed116f8-20b9-49b7-825d-0c475d0cb80f req-8c1a9f12-1cc1-47e0-9080-27ffc9836a66 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:21 np0005466013 nova_compute[192144]: 2025-10-02 12:28:21.793 2 DEBUG oslo_concurrency.lockutils [req-aed116f8-20b9-49b7-825d-0c475d0cb80f req-8c1a9f12-1cc1-47e0-9080-27ffc9836a66 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:21 np0005466013 nova_compute[192144]: 2025-10-02 12:28:21.794 2 DEBUG oslo_concurrency.lockutils [req-aed116f8-20b9-49b7-825d-0c475d0cb80f req-8c1a9f12-1cc1-47e0-9080-27ffc9836a66 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:21 np0005466013 nova_compute[192144]: 2025-10-02 12:28:21.795 2 DEBUG oslo_concurrency.lockutils [req-aed116f8-20b9-49b7-825d-0c475d0cb80f req-8c1a9f12-1cc1-47e0-9080-27ffc9836a66 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:21 np0005466013 nova_compute[192144]: 2025-10-02 12:28:21.795 2 DEBUG nova.compute.manager [req-aed116f8-20b9-49b7-825d-0c475d0cb80f req-8c1a9f12-1cc1-47e0-9080-27ffc9836a66 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:21 np0005466013 nova_compute[192144]: 2025-10-02 12:28:21.796 2 DEBUG nova.compute.manager [req-aed116f8-20b9-49b7-825d-0c475d0cb80f req-8c1a9f12-1cc1-47e0-9080-27ffc9836a66 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-unplugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.078 2 DEBUG nova.network.neutron [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.118 2 INFO nova.compute.manager [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Took 3.06 seconds to deallocate network for instance.#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.238 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.239 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.405 2 DEBUG nova.compute.provider_tree [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.438 2 DEBUG nova.scheduler.client.report [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.491 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.564 2 INFO nova.scheduler.client.report [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Deleted allocations for instance 351ed18e-9759-4759-9a20-24f8b3d59908#033[00m
Oct  2 08:28:23 np0005466013 nova_compute[192144]: 2025-10-02 12:28:23.816 2 DEBUG oslo_concurrency.lockutils [None req-f3085708-7b52-4311-929a-d31dd75a6de9 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.171 2 DEBUG nova.compute.manager [req-fc49b62d-d948-4f11-b14d-bec929ef396b req-6c0e77c2-ff81-4b4a-baf4-5a6911554625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.172 2 DEBUG oslo_concurrency.lockutils [req-fc49b62d-d948-4f11-b14d-bec929ef396b req-6c0e77c2-ff81-4b4a-baf4-5a6911554625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.172 2 DEBUG oslo_concurrency.lockutils [req-fc49b62d-d948-4f11-b14d-bec929ef396b req-6c0e77c2-ff81-4b4a-baf4-5a6911554625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.173 2 DEBUG oslo_concurrency.lockutils [req-fc49b62d-d948-4f11-b14d-bec929ef396b req-6c0e77c2-ff81-4b4a-baf4-5a6911554625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "351ed18e-9759-4759-9a20-24f8b3d59908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.173 2 DEBUG nova.compute.manager [req-fc49b62d-d948-4f11-b14d-bec929ef396b req-6c0e77c2-ff81-4b4a-baf4-5a6911554625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] No waiting events found dispatching network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.173 2 WARNING nova.compute.manager [req-fc49b62d-d948-4f11-b14d-bec929ef396b req-6c0e77c2-ff81-4b4a-baf4-5a6911554625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received unexpected event network-vif-plugged-57aa5324-a5d7-424f-b3ea-20887a672a1c for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.173 2 DEBUG nova.compute.manager [req-fc49b62d-d948-4f11-b14d-bec929ef396b req-6c0e77c2-ff81-4b4a-baf4-5a6911554625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Received event network-vif-deleted-57aa5324-a5d7-424f-b3ea-20887a672a1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:24 np0005466013 nova_compute[192144]: 2025-10-02 12:28:24.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:26.614 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:28 np0005466013 nova_compute[192144]: 2025-10-02 12:28:28.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:28 np0005466013 nova_compute[192144]: 2025-10-02 12:28:28.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:28:29 np0005466013 nova_compute[192144]: 2025-10-02 12:28:29.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:29 np0005466013 nova_compute[192144]: 2025-10-02 12:28:29.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:29 np0005466013 nova_compute[192144]: 2025-10-02 12:28:29.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.043 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.044 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.044 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.044 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.187 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:30 np0005466013 NetworkManager[51205]: <info>  [1759408110.1972] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Oct  2 08:28:30 np0005466013 NetworkManager[51205]: <info>  [1759408110.1988] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.259 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.260 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:30Z|00549|binding|INFO|Releasing lport 41abd6d0-501e-436d-9123-d8936335a0b5 from this chassis (sb_readonly=0)
Oct  2 08:28:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:30Z|00550|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:28:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:30Z|00551|binding|INFO|Releasing lport 8baac968-8bfc-4d6e-93cc-be5861eaa459 from this chassis (sb_readonly=0)
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.348 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.357 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.828 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:30 np0005466013 nova_compute[192144]: 2025-10-02 12:28:30.829 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.063 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.284 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.285 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5298MB free_disk=73.17499542236328GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.286 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.286 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.890 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance f92877b9-dd8b-4444-a42b-987004802928 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.890 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance ca651811-3d96-4b41-a50d-bbaeaf3da808 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.891 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.891 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:28:31 np0005466013 nova_compute[192144]: 2025-10-02 12:28:31.986 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:28:32 np0005466013 nova_compute[192144]: 2025-10-02 12:28:32.018 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:28:32 np0005466013 nova_compute[192144]: 2025-10-02 12:28:32.072 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:28:32 np0005466013 nova_compute[192144]: 2025-10-02 12:28:32.072 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:32Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:b7:65 10.100.0.13
Oct  2 08:28:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:32Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:b7:65 10.100.0.13
Oct  2 08:28:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:33Z|00552|binding|INFO|Releasing lport 41abd6d0-501e-436d-9123-d8936335a0b5 from this chassis (sb_readonly=0)
Oct  2 08:28:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:33Z|00553|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:28:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:33Z|00554|binding|INFO|Releasing lport 8baac968-8bfc-4d6e-93cc-be5861eaa459 from this chassis (sb_readonly=0)
Oct  2 08:28:33 np0005466013 nova_compute[192144]: 2025-10-02 12:28:33.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.245 2 DEBUG nova.compute.manager [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-changed-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.246 2 DEBUG nova.compute.manager [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing instance network info cache due to event network-changed-087d1308-7c0a-45ab-b876-1dfdceb622d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.246 2 DEBUG oslo_concurrency.lockutils [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.246 2 DEBUG oslo_concurrency.lockutils [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.246 2 DEBUG nova.network.neutron [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing network info cache for port 087d1308-7c0a-45ab-b876-1dfdceb622d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.248 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408099.2439344, 351ed18e-9759-4759-9a20-24f8b3d59908 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.248 2 INFO nova.compute.manager [-] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.278 2 DEBUG nova.compute.manager [None req-f7077f59-8f87-4b29-9d7b-3254581cb07e - - - - - -] [instance: 351ed18e-9759-4759-9a20-24f8b3d59908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:34 np0005466013 podman[241314]: 2025-10-02 12:28:34.708025913 +0000 UTC m=+0.073214795 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:28:34 np0005466013 podman[241315]: 2025-10-02 12:28:34.726450001 +0000 UTC m=+0.090907200 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:28:34 np0005466013 podman[241316]: 2025-10-02 12:28:34.744173426 +0000 UTC m=+0.100138579 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:34 np0005466013 nova_compute[192144]: 2025-10-02 12:28:34.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:37 np0005466013 nova_compute[192144]: 2025-10-02 12:28:37.930 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:37 np0005466013 nova_compute[192144]: 2025-10-02 12:28:37.930 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:37 np0005466013 nova_compute[192144]: 2025-10-02 12:28:37.931 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:37 np0005466013 nova_compute[192144]: 2025-10-02 12:28:37.931 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:37 np0005466013 nova_compute[192144]: 2025-10-02 12:28:37.931 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:37 np0005466013 nova_compute[192144]: 2025-10-02 12:28:37.950 2 INFO nova.compute.manager [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Terminating instance#033[00m
Oct  2 08:28:37 np0005466013 nova_compute[192144]: 2025-10-02 12:28:37.974 2 DEBUG nova.compute.manager [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:28:38 np0005466013 kernel: tap60b9a0ec-2a (unregistering): left promiscuous mode
Oct  2 08:28:38 np0005466013 NetworkManager[51205]: <info>  [1759408118.0462] device (tap60b9a0ec-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:28:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:38Z|00555|binding|INFO|Releasing lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e from this chassis (sb_readonly=0)
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:38Z|00556|binding|INFO|Setting lport 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e down in Southbound
Oct  2 08:28:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:38Z|00557|binding|INFO|Removing iface tap60b9a0ec-2a ovn-installed in OVS
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:38.074 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:10 10.100.0.4'], port_security=['fa:16:3e:f0:49:10 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f92877b9-dd8b-4444-a42b-987004802928', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:38.078 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 60b9a0ec-2ade-4f90-a7b0-443ac527ec3e in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:28:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:38.081 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aa4ebb90-ef5e-4974-a53d-2aabd696731a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:28:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:38.083 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bedba6-2181-4f12-a1c3-b04cecab1265]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:38.084 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace which is not needed anymore#033[00m
Oct  2 08:28:38 np0005466013 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000077.scope: Deactivated successfully.
Oct  2 08:28:38 np0005466013 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000077.scope: Consumed 21.864s CPU time.
Oct  2 08:28:38 np0005466013 systemd-machined[152202]: Machine qemu-59-instance-00000077 terminated.
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.261 2 INFO nova.virt.libvirt.driver [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Instance destroyed successfully.#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.262 2 DEBUG nova.objects.instance [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'resources' on Instance uuid f92877b9-dd8b-4444-a42b-987004802928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.282 2 DEBUG nova.virt.libvirt.vif [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:24:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-915765106',display_name='tempest-ServerStableDeviceRescueTest-server-915765106',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-915765106',id=119,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:25:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-6ztfdnas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:48Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=f92877b9-dd8b-4444-a42b-987004802928,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.284 2 DEBUG nova.network.os_vif_util [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "address": "fa:16:3e:f0:49:10", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60b9a0ec-2a", "ovs_interfaceid": "60b9a0ec-2ade-4f90-a7b0-443ac527ec3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.285 2 DEBUG nova.network.os_vif_util [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.286 2 DEBUG os_vif [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.288 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60b9a0ec-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.296 2 INFO os_vif [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:49:10,bridge_name='br-int',has_traffic_filtering=True,id=60b9a0ec-2ade-4f90-a7b0-443ac527ec3e,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60b9a0ec-2a')#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.297 2 INFO nova.virt.libvirt.driver [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Deleting instance files /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928_del#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.298 2 INFO nova.virt.libvirt.driver [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Deletion of /var/lib/nova/instances/f92877b9-dd8b-4444-a42b-987004802928_del complete#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.399 2 INFO nova.compute.manager [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.400 2 DEBUG oslo.service.loopingcall [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.401 2 DEBUG nova.compute.manager [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.401 2 DEBUG nova.network.neutron [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:28:38 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [NOTICE]   (239718) : haproxy version is 2.8.14-c23fe91
Oct  2 08:28:38 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [NOTICE]   (239718) : path to executable is /usr/sbin/haproxy
Oct  2 08:28:38 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [WARNING]  (239718) : Exiting Master process...
Oct  2 08:28:38 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [WARNING]  (239718) : Exiting Master process...
Oct  2 08:28:38 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [ALERT]    (239718) : Current worker (239720) exited with code 143 (Terminated)
Oct  2 08:28:38 np0005466013 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239714]: [WARNING]  (239718) : All workers exited. Exiting... (0)
Oct  2 08:28:38 np0005466013 systemd[1]: libpod-6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199.scope: Deactivated successfully.
Oct  2 08:28:38 np0005466013 podman[241410]: 2025-10-02 12:28:38.44053538 +0000 UTC m=+0.211893510 container died 6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.756 2 DEBUG nova.network.neutron [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updated VIF entry in instance network info cache for port 087d1308-7c0a-45ab-b876-1dfdceb622d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.757 2 DEBUG nova.network.neutron [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updating instance_info_cache with network_info: [{"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:38 np0005466013 nova_compute[192144]: 2025-10-02 12:28:38.808 2 DEBUG oslo_concurrency.lockutils [req-1ee4364c-4950-48b5-a3ac-cb633c8bf18f req-debb3958-0119-49b3-b1b0-3f1a9efa9655 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:28:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199-userdata-shm.mount: Deactivated successfully.
Oct  2 08:28:38 np0005466013 systemd[1]: var-lib-containers-storage-overlay-ab613bcf399e87a977e1f0f2e769eda7bfe48b5e70667bf48a4383daa37fe142-merged.mount: Deactivated successfully.
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.073 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.075 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.075 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.097 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.097 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.098 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.099 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.099 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.100 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:39 np0005466013 podman[241410]: 2025-10-02 12:28:39.13341962 +0000 UTC m=+0.904777760 container cleanup 6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:28:39 np0005466013 systemd[1]: libpod-conmon-6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199.scope: Deactivated successfully.
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.462 2 DEBUG nova.network.neutron [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.616 2 INFO nova.compute.manager [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] Took 1.21 seconds to deallocate network for instance.#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.685 2 DEBUG nova.compute.manager [req-63042195-3d14-4288-8671-86d09b324008 req-9a98bc8c-3aaa-4380-b71d-21e9dd31a98f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-deleted-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.732 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.733 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.829 2 DEBUG nova.compute.provider_tree [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.846 2 DEBUG nova.scheduler.client.report [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.886 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.921 2 DEBUG nova.compute.manager [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.922 2 DEBUG oslo_concurrency.lockutils [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.922 2 DEBUG oslo_concurrency.lockutils [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.922 2 DEBUG oslo_concurrency.lockutils [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.922 2 DEBUG nova.compute.manager [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.923 2 WARNING nova.compute.manager [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-unplugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.923 2 DEBUG nova.compute.manager [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.923 2 DEBUG oslo_concurrency.lockutils [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f92877b9-dd8b-4444-a42b-987004802928-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.923 2 DEBUG oslo_concurrency.lockutils [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.923 2 DEBUG oslo_concurrency.lockutils [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.924 2 DEBUG nova.compute.manager [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] No waiting events found dispatching network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.924 2 WARNING nova.compute.manager [req-8df1e34b-de42-4bd1-9add-5377cd6c9b19 req-08cdd1b5-2c79-4b7c-a268-1c745b0ad3b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f92877b9-dd8b-4444-a42b-987004802928] Received unexpected event network-vif-plugged-60b9a0ec-2ade-4f90-a7b0-443ac527ec3e for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.926 2 INFO nova.scheduler.client.report [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Deleted allocations for instance f92877b9-dd8b-4444-a42b-987004802928#033[00m
Oct  2 08:28:39 np0005466013 podman[241453]: 2025-10-02 12:28:39.975627548 +0000 UTC m=+0.807546183 container remove 6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:28:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:39.986 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dc2d0c6c-ef7d-4992-bb0d-da33fb8a938b]: (4, ('Thu Oct  2 12:28:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199)\n6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199\nThu Oct  2 12:28:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199)\n6edf76ae29fb1c28b03dd8ad198b8e751e47f2fbb6cf975e9f01dc0be0851199\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:39.989 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[675620a5-5d22-4299-ad05-2f7d5c871a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:39 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:39.992 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:39 np0005466013 kernel: tapaa4ebb90-e0: left promiscuous mode
Oct  2 08:28:39 np0005466013 nova_compute[192144]: 2025-10-02 12:28:39.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:40 np0005466013 nova_compute[192144]: 2025-10-02 12:28:40.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:40.004 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4b74b97b-03e1-4804-8776-bceb6fbc7d71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:40 np0005466013 nova_compute[192144]: 2025-10-02 12:28:40.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:40.045 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0960564a-9e22-45a4-b74c-e7ded90f61ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:40.047 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[54aa117a-0039-446a-9181-c5bb34afec3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:40 np0005466013 nova_compute[192144]: 2025-10-02 12:28:40.052 2 DEBUG oslo_concurrency.lockutils [None req-7172c53c-b4af-415e-afa7-21d86fdbb7ec abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "f92877b9-dd8b-4444-a42b-987004802928" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:40.071 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9fe532-6c29-4236-adc8-421d30c832bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593985, 'reachable_time': 41671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241467, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:40 np0005466013 systemd[1]: run-netns-ovnmeta\x2daa4ebb90\x2def5e\x2d4974\x2da53d\x2d2aabd696731a.mount: Deactivated successfully.
Oct  2 08:28:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:40.076 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:28:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:28:40.077 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[218be1c2-b6b4-4feb-aa89-3f6c9115ddb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:42 np0005466013 nova_compute[192144]: 2025-10-02 12:28:42.013 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:43 np0005466013 nova_compute[192144]: 2025-10-02 12:28:43.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:44 np0005466013 podman[241470]: 2025-10-02 12:28:44.728822035 +0000 UTC m=+0.089485425 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:28:44 np0005466013 podman[241471]: 2025-10-02 12:28:44.727936227 +0000 UTC m=+0.094346807 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:28:44 np0005466013 podman[241472]: 2025-10-02 12:28:44.739096867 +0000 UTC m=+0.099694085 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:28:44 np0005466013 nova_compute[192144]: 2025-10-02 12:28:44.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:46Z|00558|binding|INFO|Releasing lport 41abd6d0-501e-436d-9123-d8936335a0b5 from this chassis (sb_readonly=0)
Oct  2 08:28:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:46Z|00559|binding|INFO|Releasing lport 8baac968-8bfc-4d6e-93cc-be5861eaa459 from this chassis (sb_readonly=0)
Oct  2 08:28:46 np0005466013 nova_compute[192144]: 2025-10-02 12:28:46.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:47 np0005466013 nova_compute[192144]: 2025-10-02 12:28:47.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:48 np0005466013 nova_compute[192144]: 2025-10-02 12:28:48.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:49 np0005466013 podman[241534]: 2025-10-02 12:28:49.723597593 +0000 UTC m=+0.083681283 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:28:49 np0005466013 podman[241535]: 2025-10-02 12:28:49.739051398 +0000 UTC m=+0.092496250 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:28:49 np0005466013 nova_compute[192144]: 2025-10-02 12:28:49.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:53 np0005466013 nova_compute[192144]: 2025-10-02 12:28:53.260 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408118.258567, f92877b9-dd8b-4444-a42b-987004802928 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:53 np0005466013 nova_compute[192144]: 2025-10-02 12:28:53.261 2 INFO nova.compute.manager [-] [instance: f92877b9-dd8b-4444-a42b-987004802928] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:28:53 np0005466013 nova_compute[192144]: 2025-10-02 12:28:53.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:53Z|00560|binding|INFO|Releasing lport 41abd6d0-501e-436d-9123-d8936335a0b5 from this chassis (sb_readonly=0)
Oct  2 08:28:53 np0005466013 ovn_controller[94366]: 2025-10-02T12:28:53Z|00561|binding|INFO|Releasing lport 8baac968-8bfc-4d6e-93cc-be5861eaa459 from this chassis (sb_readonly=0)
Oct  2 08:28:53 np0005466013 nova_compute[192144]: 2025-10-02 12:28:53.491 2 DEBUG nova.compute.manager [None req-fd4641d5-48f4-4da8-8b13-aadb22fab4cb - - - - - -] [instance: f92877b9-dd8b-4444-a42b-987004802928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:53 np0005466013 nova_compute[192144]: 2025-10-02 12:28:53.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:54 np0005466013 nova_compute[192144]: 2025-10-02 12:28:54.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:58 np0005466013 nova_compute[192144]: 2025-10-02 12:28:58.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:58 np0005466013 nova_compute[192144]: 2025-10-02 12:28:58.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:59 np0005466013 nova_compute[192144]: 2025-10-02 12:28:59.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:02.312 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:02.313 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:02.314 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:03 np0005466013 nova_compute[192144]: 2025-10-02 12:29:03.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:04 np0005466013 nova_compute[192144]: 2025-10-02 12:29:04.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:05 np0005466013 podman[241586]: 2025-10-02 12:29:05.68236067 +0000 UTC m=+0.062147008 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:29:05 np0005466013 podman[241587]: 2025-10-02 12:29:05.699104245 +0000 UTC m=+0.072145461 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 08:29:05 np0005466013 podman[241588]: 2025-10-02 12:29:05.743139534 +0000 UTC m=+0.114360144 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:06 np0005466013 nova_compute[192144]: 2025-10-02 12:29:06.712 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:06 np0005466013 nova_compute[192144]: 2025-10-02 12:29:06.713 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:06 np0005466013 nova_compute[192144]: 2025-10-02 12:29:06.771 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:29:07 np0005466013 nova_compute[192144]: 2025-10-02 12:29:07.001 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:07 np0005466013 nova_compute[192144]: 2025-10-02 12:29:07.002 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:07 np0005466013 nova_compute[192144]: 2025-10-02 12:29:07.010 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:29:07 np0005466013 nova_compute[192144]: 2025-10-02 12:29:07.010 2 INFO nova.compute.claims [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:29:07 np0005466013 nova_compute[192144]: 2025-10-02 12:29:07.620 2 DEBUG nova.compute.provider_tree [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:07 np0005466013 nova_compute[192144]: 2025-10-02 12:29:07.707 2 DEBUG nova.scheduler.client.report [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.107 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.109 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.380 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.381 2 DEBUG nova.network.neutron [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.425 2 INFO nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.454 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.661 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.664 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.665 2 INFO nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Creating image(s)#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.667 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "/var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.668 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.669 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.696 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.773 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.774 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.775 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.785 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.867 2 DEBUG nova.policy [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.870 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.871 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.905 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.906 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.907 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.984 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.985 2 DEBUG nova.virt.disk.api [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Checking if we can resize image /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:29:08 np0005466013 nova_compute[192144]: 2025-10-02 12:29:08.985 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.078 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.079 2 DEBUG nova.virt.disk.api [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Cannot resize image /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.080 2 DEBUG nova.objects.instance [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 158b6775-d035-40a6-9699-a7bab42a3cbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.101 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.102 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Ensure instance console log exists: /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.103 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.103 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.104 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:09 np0005466013 nova_compute[192144]: 2025-10-02 12:29:09.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:11 np0005466013 nova_compute[192144]: 2025-10-02 12:29:11.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:11 np0005466013 nova_compute[192144]: 2025-10-02 12:29:11.490 2 DEBUG nova.network.neutron [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Successfully created port: d66594bd-b226-44dd-a0fd-5ff49b65d032 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:29:12 np0005466013 nova_compute[192144]: 2025-10-02 12:29:12.914 2 DEBUG nova.network.neutron [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Successfully updated port: d66594bd-b226-44dd-a0fd-5ff49b65d032 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:29:12 np0005466013 nova_compute[192144]: 2025-10-02 12:29:12.940 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:12 np0005466013 nova_compute[192144]: 2025-10-02 12:29:12.940 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:12 np0005466013 nova_compute[192144]: 2025-10-02 12:29:12.940 2 DEBUG nova.network.neutron [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:29:13 np0005466013 nova_compute[192144]: 2025-10-02 12:29:13.124 2 DEBUG nova.compute.manager [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-changed-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:13 np0005466013 nova_compute[192144]: 2025-10-02 12:29:13.125 2 DEBUG nova.compute.manager [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Refreshing instance network info cache due to event network-changed-d66594bd-b226-44dd-a0fd-5ff49b65d032. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:13 np0005466013 nova_compute[192144]: 2025-10-02 12:29:13.125 2 DEBUG oslo_concurrency.lockutils [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:13 np0005466013 nova_compute[192144]: 2025-10-02 12:29:13.226 2 DEBUG nova.network.neutron [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:29:13 np0005466013 nova_compute[192144]: 2025-10-02 12:29:13.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.329 2 DEBUG nova.network.neutron [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updating instance_info_cache with network_info: [{"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.428 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.429 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance network_info: |[{"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.430 2 DEBUG oslo_concurrency.lockutils [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.430 2 DEBUG nova.network.neutron [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Refreshing network info cache for port d66594bd-b226-44dd-a0fd-5ff49b65d032 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.434 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Start _get_guest_xml network_info=[{"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.439 2 WARNING nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.447 2 DEBUG nova.virt.libvirt.host [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.448 2 DEBUG nova.virt.libvirt.host [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.454 2 DEBUG nova.virt.libvirt.host [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.454 2 DEBUG nova.virt.libvirt.host [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.456 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.456 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.457 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.457 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.457 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.458 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.458 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.458 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.459 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.459 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.459 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.460 2 DEBUG nova.virt.hardware [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.465 2 DEBUG nova.virt.libvirt.vif [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:29:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2040067918',display_name='tempest-TestNetworkAdvancedServerOps-server-2040067918',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2040067918',id=131,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJH+nB7ynWmlE6CYxakDXji/dFc1+oe8aupy/8lQbmwUK6XXJX68remR0div3FEoW99WG9y1B7WUExwGPOYQ/687fHl0sNVIGCh6BhE9C68EXmJ+PMvz0f/nt1NeHV55pA==',key_name='tempest-TestNetworkAdvancedServerOps-1864208372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-b06f730g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:08Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=158b6775-d035-40a6-9699-a7bab42a3cbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.465 2 DEBUG nova.network.os_vif_util [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.466 2 DEBUG nova.network.os_vif_util [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f9:aa,bridge_name='br-int',has_traffic_filtering=True,id=d66594bd-b226-44dd-a0fd-5ff49b65d032,network=Network(9f2a05d8-1f43-4329-b388-8811ae8293ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66594bd-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.467 2 DEBUG nova.objects.instance [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 158b6775-d035-40a6-9699-a7bab42a3cbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.511 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <uuid>158b6775-d035-40a6-9699-a7bab42a3cbc</uuid>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <name>instance-00000083</name>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2040067918</nova:name>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:29:14</nova:creationTime>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        <nova:port uuid="d66594bd-b226-44dd-a0fd-5ff49b65d032">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <entry name="serial">158b6775-d035-40a6-9699-a7bab42a3cbc</entry>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <entry name="uuid">158b6775-d035-40a6-9699-a7bab42a3cbc</entry>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk.config"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:7f:f9:aa"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <target dev="tapd66594bd-b2"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/console.log" append="off"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:29:14 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:29:14 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:29:14 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:29:14 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.513 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Preparing to wait for external event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.514 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.514 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.515 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.517 2 DEBUG nova.virt.libvirt.vif [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:29:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2040067918',display_name='tempest-TestNetworkAdvancedServerOps-server-2040067918',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2040067918',id=131,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJH+nB7ynWmlE6CYxakDXji/dFc1+oe8aupy/8lQbmwUK6XXJX68remR0div3FEoW99WG9y1B7WUExwGPOYQ/687fHl0sNVIGCh6BhE9C68EXmJ+PMvz0f/nt1NeHV55pA==',key_name='tempest-TestNetworkAdvancedServerOps-1864208372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-b06f730g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:08Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=158b6775-d035-40a6-9699-a7bab42a3cbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.517 2 DEBUG nova.network.os_vif_util [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.519 2 DEBUG nova.network.os_vif_util [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f9:aa,bridge_name='br-int',has_traffic_filtering=True,id=d66594bd-b226-44dd-a0fd-5ff49b65d032,network=Network(9f2a05d8-1f43-4329-b388-8811ae8293ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66594bd-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.520 2 DEBUG os_vif [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f9:aa,bridge_name='br-int',has_traffic_filtering=True,id=d66594bd-b226-44dd-a0fd-5ff49b65d032,network=Network(9f2a05d8-1f43-4329-b388-8811ae8293ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66594bd-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.524 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.525 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.533 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd66594bd-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.534 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd66594bd-b2, col_values=(('external_ids', {'iface-id': 'd66594bd-b226-44dd-a0fd-5ff49b65d032', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:f9:aa', 'vm-uuid': '158b6775-d035-40a6-9699-a7bab42a3cbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005466013 NetworkManager[51205]: <info>  [1759408154.5373] manager: (tapd66594bd-b2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.552 2 INFO os_vif [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f9:aa,bridge_name='br-int',has_traffic_filtering=True,id=d66594bd-b226-44dd-a0fd-5ff49b65d032,network=Network(9f2a05d8-1f43-4329-b388-8811ae8293ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66594bd-b2')#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.664 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.665 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.665 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No VIF found with MAC fa:16:3e:7f:f9:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.666 2 INFO nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Using config drive#033[00m
Oct  2 08:29:14 np0005466013 nova_compute[192144]: 2025-10-02 12:29:14.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.285 2 INFO nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Creating config drive at /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk.config#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.291 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm6bj14zv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.431 2 DEBUG oslo_concurrency.processutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm6bj14zv" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:15 np0005466013 kernel: tapd66594bd-b2: entered promiscuous mode
Oct  2 08:29:15 np0005466013 NetworkManager[51205]: <info>  [1759408155.5498] manager: (tapd66594bd-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Oct  2 08:29:15 np0005466013 systemd-udevd[241712]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:15Z|00562|binding|INFO|Claiming lport d66594bd-b226-44dd-a0fd-5ff49b65d032 for this chassis.
Oct  2 08:29:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:15Z|00563|binding|INFO|d66594bd-b226-44dd-a0fd-5ff49b65d032: Claiming fa:16:3e:7f:f9:aa 10.100.0.13
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.636 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:f9:aa 10.100.0.13'], port_security=['fa:16:3e:7f:f9:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1694e9dd-274b-47a7-823d-2b0bb81c13da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b545250-58ed-42b4-932e-b2ddd5229036, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d66594bd-b226-44dd-a0fd-5ff49b65d032) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.638 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d66594bd-b226-44dd-a0fd-5ff49b65d032 in datapath 9f2a05d8-1f43-4329-b388-8811ae8293ca bound to our chassis#033[00m
Oct  2 08:29:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:15Z|00564|binding|INFO|Setting lport d66594bd-b226-44dd-a0fd-5ff49b65d032 ovn-installed in OVS
Oct  2 08:29:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:15Z|00565|binding|INFO|Setting lport d66594bd-b226-44dd-a0fd-5ff49b65d032 up in Southbound
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.641 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f2a05d8-1f43-4329-b388-8811ae8293ca#033[00m
Oct  2 08:29:15 np0005466013 NetworkManager[51205]: <info>  [1759408155.6503] device (tapd66594bd-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:29:15 np0005466013 NetworkManager[51205]: <info>  [1759408155.6512] device (tapd66594bd-b2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.657 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae270e2-c50d-49e5-9cb7-a4a7236872bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.659 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f2a05d8-11 in ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.660 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f2a05d8-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.660 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[09dc0f95-650a-4956-9f97-cf748e8f133f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.662 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[32573c0c-147c-4fd5-b292-336e73328a5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 systemd-machined[152202]: New machine qemu-64-instance-00000083.
Oct  2 08:29:15 np0005466013 podman[241682]: 2025-10-02 12:29:15.676495121 +0000 UTC m=+0.147559945 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.676 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[6b78d4c1-e9e2-437b-bd17-ee370ec6ac9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 systemd[1]: Started Virtual Machine qemu-64-instance-00000083.
Oct  2 08:29:15 np0005466013 podman[241683]: 2025-10-02 12:29:15.687113114 +0000 UTC m=+0.157185557 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.696 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6919bc17-0e0c-431d-89f1-b4fbc87c47c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 podman[241684]: 2025-10-02 12:29:15.706272964 +0000 UTC m=+0.157001100 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.733 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e8855b78-ff8d-4c7c-acf1-273078fe927f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 NetworkManager[51205]: <info>  [1759408155.7431] manager: (tap9f2a05d8-10): new Veth device (/org/freedesktop/NetworkManager/Devices/251)
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.741 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1f909381-3924-4311-8204-5216f7e7f476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 systemd-udevd[241727]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.776 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[677242e1-4413-4c56-bdf2-45661d6730d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.780 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[43eb3ed9-457c-4966-85b0-3993b0ca6c38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 NetworkManager[51205]: <info>  [1759408155.8057] device (tap9f2a05d8-10): carrier: link connected
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.813 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[80a80fe8-9ea2-4d65-b9d3-ed95949743e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.837 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[429685d5-ee50-48e5-99a7-bf32f8aaf1b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f2a05d8-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:03:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614742, 'reachable_time': 43503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241781, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.856 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7215c576-cdbc-4d37-939f-8b65ac07af26]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:30e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 614742, 'tstamp': 614742}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241782, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.879 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3f8c69-4776-404c-8bdc-9105e86de4e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f2a05d8-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:03:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614742, 'reachable_time': 43503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241783, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.914 2 DEBUG nova.compute.manager [req-3d52bd58-6dc6-4dd3-96d6-f2d50094300a req-6a030962-67b1-4ab3-924f-e5d15070e474 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.915 2 DEBUG oslo_concurrency.lockutils [req-3d52bd58-6dc6-4dd3-96d6-f2d50094300a req-6a030962-67b1-4ab3-924f-e5d15070e474 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.915 2 DEBUG oslo_concurrency.lockutils [req-3d52bd58-6dc6-4dd3-96d6-f2d50094300a req-6a030962-67b1-4ab3-924f-e5d15070e474 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.915 2 DEBUG oslo_concurrency.lockutils [req-3d52bd58-6dc6-4dd3-96d6-f2d50094300a req-6a030962-67b1-4ab3-924f-e5d15070e474 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:15 np0005466013 nova_compute[192144]: 2025-10-02 12:29:15.915 2 DEBUG nova.compute.manager [req-3d52bd58-6dc6-4dd3-96d6-f2d50094300a req-6a030962-67b1-4ab3-924f-e5d15070e474 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Processing event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.917 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[087b4327-93d6-4e86-bc0a-20889464394c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:15.997 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ee8c86-f9cf-443a-b26f-c9561d1c0641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.001 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f2a05d8-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.002 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.003 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f2a05d8-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:16 np0005466013 kernel: tap9f2a05d8-10: entered promiscuous mode
Oct  2 08:29:16 np0005466013 NetworkManager[51205]: <info>  [1759408156.0096] manager: (tap9f2a05d8-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.012 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f2a05d8-10, col_values=(('external_ids', {'iface-id': '325e84b6-74e4-4e02-a145-8f2619c4e99c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:16Z|00566|binding|INFO|Releasing lport 325e84b6-74e4-4e02-a145-8f2619c4e99c from this chassis (sb_readonly=0)
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.019 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f2a05d8-1f43-4329-b388-8811ae8293ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f2a05d8-1f43-4329-b388-8811ae8293ca.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.021 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a04eec10-3a93-4b89-a68f-f252992602bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.023 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-9f2a05d8-1f43-4329-b388-8811ae8293ca
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/9f2a05d8-1f43-4329-b388-8811ae8293ca.pid.haproxy
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 9f2a05d8-1f43-4329-b388-8811ae8293ca
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:29:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:16.026 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'env', 'PROCESS_TAG=haproxy-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f2a05d8-1f43-4329-b388-8811ae8293ca.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.164 2 DEBUG nova.network.neutron [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updated VIF entry in instance network info cache for port d66594bd-b226-44dd-a0fd-5ff49b65d032. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.165 2 DEBUG nova.network.neutron [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updating instance_info_cache with network_info: [{"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.186 2 DEBUG oslo_concurrency.lockutils [req-a8481c6b-3d8f-41f8-8254-e4cade6b0811 req-0114e915-8e46-4a8e-a2f3-1fbadaa68db5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.366 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'name': 'tempest-TestGettingAddress-server-986630317', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000080', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'hostId': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.370 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000083', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'hostId': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.371 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.387 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.388 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.427 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408156.426598, 158b6775-d035-40a6-9699-a7bab42a3cbc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.428 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] VM Started (Lifecycle Event)#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.430 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.432 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.433 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.435 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab19c933-78cd-4696-be61-8d7424ac87d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.371973', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6944e2c0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.050221253, 'message_signature': 'e252f8864a41ce4c0b264fc24fbf349ccc527eb4beae0d4039f2995241b6c90b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.371973', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6944ffb2-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.050221253, 'message_signature': '90347490a82ea3bbff269ba6835690d2dbe2b1f4e8cfe8e45537282aca2b0426'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.371973', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '694bcc0c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.066959857, 'message_signature': '31a57fd0ac71b4c8fc44677980b2d043f5c8a68878b9e5bcb8dd225ef88dcd1d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.371973', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '694be5fc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.066959857, 'message_signature': '3de792bf6badbd4a4e93ba016db751d6ead96df4a9f509e2111a0464d24318e5'}]}, 'timestamp': '2025-10-02 12:29:16.434103', '_unique_id': '4664d58d320f469caf4a8f9d209bb7a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.437 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.439 2 INFO nova.virt.libvirt.driver [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance spawned successfully.
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.440 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.463 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.write.latency volume: 125898183914 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.463 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 podman[241822]: 2025-10-02 12:29:16.479037566 +0000 UTC m=+0.081522155 container create a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.480 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.491 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.495 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.496 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.499 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.499 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f46e054a-ccbb-44a6-bd28-c7f00d846085', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 125898183914, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.438094', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69507414-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': 'a9cc28f070c39e5c5a3410d04846b00b5ac1f08c3f7d41b049e0f2932478f741'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.438094', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6950858a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': '71cb05c0d264348efd103102249891e60f862f911e714ad5961e2ce82e33ac91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.438094', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '695567d0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': 'a2a3c2225f4e2982fa112ed3026c553885a3ac6d623acc91c30414c1fe8620cf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.438094', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69557cc0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': '0835441aada83f806d71b13fd5179dbc9f882ad424899b8ba630055142f96096'}]}, 'timestamp': '2025-10-02 12:29:16.496898', '_unique_id': 'a81d5d04ac7c46ffbc1470cc11eab67d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.498 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.500 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.500 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.write.requests volume: 279 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.500 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.500 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.501 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.501 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.501 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.502 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58f6d4e4-d11e-4afa-8e90-250a14a5cba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 279, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.500600', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '695620ee-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': '352d65f468a5d1713a49668e3e5e4fca6a681af83cb0f66374db9146c00052ef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.500600', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69562e18-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': 'c8bb56e8d514293c873285ca8a55686a4bfef739e5b5fb84bbd3b2fe816ba530'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.500600', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6956391c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': '61dad4519e08ae5193a9cc77519e77d365faf08bf5c33acf74eac25d2dc3bf5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.500600', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '695643bc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': '50097801774313e0365fe6d443cb140b12d630e43dc17743ea4e27a0fb99761f'}]}, 'timestamp': '2025-10-02 12:29:16.501858', '_unique_id': 'd04420d950a1494d92e89bbef9c6c7a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.503 2 DEBUG nova.virt.libvirt.driver [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.502 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.503 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.508 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ca651811-3d96-4b41-a50d-bbaeaf3da808 / tap087d1308-7c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.509 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ca651811-3d96-4b41-a50d-bbaeaf3da808 / tap6949320c-d0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.509 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.509 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.512 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 158b6775-d035-40a6-9699-a7bab42a3cbc / tapd66594bd-b2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.512 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9a5b7af-2f5c-4340-a543-e053a6c8aefe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.503992', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '69577304-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'df8f6df0f01f658d45b1e27cd1953bcbdad0a65ad05ebdcf56cdc3f84bf64a2c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.503992', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '69577ffc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '9a932f03cf5c34a971f17cf723ff3369d9a839deec037b47fd9232baa2adb403'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.503992', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': 
'817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '6957efc8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': '542dd046146b5d1318ea7e27d467bacaf09d27d3953a89f51a62a67abccc8a3f'}]}, 'timestamp': '2025-10-02 12:29:16.512878', '_unique_id': '058276d29392431eb2ea0cf36a54af16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.513 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.514 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:29:16 np0005466013 podman[241822]: 2025-10-02 12:29:16.427244013 +0000 UTC m=+0.029728632 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.531 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/memory.usage volume: 43.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.551 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.552 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408156.426832, 158b6775-d035-40a6-9699-a7bab42a3cbc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.552 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] VM Paused (Lifecycle Event)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.559 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.560 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 158b6775-d035-40a6-9699-a7bab42a3cbc: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '938350da-f249-4fac-a8a4-3894acf0feab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.63671875, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'timestamp': '2025-10-02T12:29:16.514757', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '695ad292-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.209473553, 'message_signature': '331c7261cf19581c63deeb82b193ea9bbc74c2d6cfee1cf952bd1b04005b3c0c'}]}, 'timestamp': '2025-10-02 12:29:16.560270', '_unique_id': '6443bd7a6f86460ca1f02cca3f6ae829'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.561 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.562 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.562 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.562 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1853211a-c927-45a3-8784-bad1a16bb5fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.562161', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '695f83b4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'd3ff9c1d52c1919bf98e5458aea7dca09de56c55c69d00ceaaec7675c4d22d47'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.562161', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '695f8d64-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '6ef4408c2b7a2fd493c68875b0e8095df9ab059cc3996350e50b9a73ee299aad'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.562161', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': 
'817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '695f9548-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': 'cc1553e5b038f3695596fa17bba428cd9680f445f8c0c83f38776206b8e26439'}]}, 'timestamp': '2025-10-02 12:29:16.562975', '_unique_id': '58fdabd2a7444d5fb7f25e223b42057f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.563 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.564 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.564 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.write.bytes volume: 72970240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.564 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.564 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.564 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1970dc6f-dc03-4b0b-bae9-5349efe37864', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72970240, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.564174', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '695fd0bc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': 'b41466f775d6ad245091f883ba3870e020cb980e5abb721816fc7201cc11c20e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.564174', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '695fd882-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': '1cf4b8cc4077a1c4aa6a3c1b4f72d11f6c9c7dead820cdb0942c71dcf89cf180'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.564174', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '695fdfda-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': '5282d8a4da1afb3e8735b31ca8acbf9ad29d817e15b8032e8635f7927add5ba3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.564174', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '695fe854-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': 'cafe21bb807e69f2a0bccffdb151b0e191e1fe5f80d26e073f3f30637a66dbc2'}]}, 'timestamp': '2025-10-02 12:29:16.565020', '_unique_id': 'af794bc6243241ff8aa2fef9e3959904'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.565 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.566 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.566 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.read.latency volume: 2201981340 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.566 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.read.latency volume: 52157922 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.566 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.566 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35cc4e9c-b3bc-41c0-8bf5-1a40f71e4fc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2201981340, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.566331', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69602508-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': '8e9ff0add2bf415d36c5241bf0e4e9419af2813cac7081292429dc596cba4d88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52157922, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.566331', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69602cec-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': 'ee7fcc15ceb778dd67b8ff1a58b323538667d363e784df7f5f5278d9217754e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.566331', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6960352a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': 'af1cd45d401c9cbbaaf6218e1d5c0f5b41ce211fb10c6f0b580ffb426c77a85e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.566331', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69603c82-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': 'd51f5631ec9902c8c4ef5fee0bfe2a112a6cd956d742f7a1ace22eff82f186df'}]}, 'timestamp': '2025-10-02 12:29:16.567142', '_unique_id': '4bf7a78c14ea412780f30c0f146e9d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.567 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.568 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.568 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.568 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.568 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '835142c5-56ca-4e0a-8529-0df140bd902f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.568477', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '696078e6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '4dcf95c52ab5d8c3a888ab5f51606322ac7cb8c5fcf1a61fd9663eafa4a56175'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.568477', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '696080ca-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '81e6a4a29d899e4ff7d9a3a582c9b9b98ebe6ed6fcc4004f94fb850e8e5a5d13'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.568477', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '696089a8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': '60c570ce65d1bdc3010a4308666295ddc3bd43c252393dbaff48aa5c19a44ea6'}]}, 'timestamp': '2025-10-02 12:29:16.569151', '_unique_id': 'a3d7462150f24b7184ccb1c9b91ee823'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.569 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.570 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.570 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>]
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.570 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.570 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.570 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.571 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.571 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78284a41-ea6d-4bee-90aa-9eea91f8c416', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.570727', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6960d16a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.050221253, 'message_signature': 'ebac1efd0bd82526a52d9421dbabae651ec52f8ac011d558354988555aeb54ec'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.570727', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6960d94e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.050221253, 'message_signature': '134f61bdf3e9ac1096a73fc2f03b280be2c93dc8563fd67aca4c29a6ef42346b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.570727', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6960e0a6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.066959857, 'message_signature': '8f4d750f7feb9d9fe731cfd1c862757957fead53bf666df6372fb0fe609137f5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.570727', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6960e7c2-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.066959857, 'message_signature': '36ba90b274fc14e2ecee80106c028af7dcf8b9e565782fba4e9ce7a8ef36b39e'}]}, 'timestamp': '2025-10-02 12:29:16.571546', '_unique_id': 'f0cfdb50a85b4ac49976838c07b211f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.572 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.bytes volume: 4421 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.573 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.bytes volume: 1040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.573 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d531134-7bb5-4f56-b154-1453e59237d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4421, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.572883', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '69612714-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '324573fbbeba8127d858647a7d17210a87523f63c01c9ed843fb0df3da4a5247'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1040, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.572883', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '69613330-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '139d43fca99f698e1bdf2dc50e321b92bb32d890e9cdcdbdc1ebc36dd1dee9fa'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.572883', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': 
'817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '69613e5c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': 'fdcabab6267df1abb1d7660fb7b15bc57f87931d6b43fc0739867dd06c7e2e1a'}]}, 'timestamp': '2025-10-02 12:29:16.573803', '_unique_id': '0b02f78e44444974a289e37699b0a567'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.574 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.575 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.575 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.575 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>]
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.575 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.575 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.576 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.bytes volume: 2618 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.576 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e270d641-8028-49c6-952a-417c45e31996', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.575939', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '69619e4c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'ced5eb64440a128a4fc3c083d04cc1b44cae784c1edb6d34167ba6805277ecf8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2618, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.575939', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '6961a9e6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'fd76389d2fe3ec0313aac95927ebd153954132b23e409d883adecbf80c7ea36d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.575939', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': 
'817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '6961b60c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': 'dba807de8c6fa9f8064352aed2883ff1e9284d375358e3da6d6ad69b552edfcd'}]}, 'timestamp': '2025-10-02 12:29:16.576914', '_unique_id': 'a4b090660c5a42afb01a4b1c6018edb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.577 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.578 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.578 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.579 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.579 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a8a5c44-32b3-4b26-bd50-d334835a1181', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.578618', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69620710-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.050221253, 'message_signature': '687682ac424aa17626b1b9a39c0aaddf15fd1945e16efcd05e33836c00a59684'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.578618', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69621462-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.050221253, 'message_signature': '2bc7aa6617c9f994a5f1077099a732406c406ebd729bfe1d07e0fbe6cfaecc3c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.578618', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69621f20-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.066959857, 'message_signature': '01059fc17cbd7aa03deed1bba6e65c002316a20fc2066d279d7c9fad49a4e4fa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.578618', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69622a6a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.066959857, 'message_signature': 'f3814aba266e5cc3ea6761d901254bc47d011c8455feccf8dcacd598edd19a53'}]}, 'timestamp': '2025-10-02 12:29:16.579871', '_unique_id': '2cd8902666f6404a9c7559d21c931b9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.580 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.581 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.582 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.read.bytes volume: 31328768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.582 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.582 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.583 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.583 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4e77785-6993-4044-a4d7-a5c8180a44a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31328768, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.582097', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69629176-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': '364151521c507d081326740c8c0bb501fc4d7ed66037bdead37e61d97a7a2d31'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.582097', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69629f36-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': '324947ecd25eb2b3ccf6295ce13c77bcc488ea7faab769d3e6c2759765fa18fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.582097', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6962ab3e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': 'd8cba0ffd576b9b53a52dbf83b55cdc3332d8b62aa57eec2aceab024eb02b75c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.582097', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6962b692-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': 'ca33d0f3fbc722caeadb0c05df26d6f273b33a83f22080e9d8e64bdb9a7e5e1d'}]}, 'timestamp': '2025-10-02 12:29:16.583441', '_unique_id': 'b7e4d1dc017c41f1b2e1f8bdfb48028b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.587 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408156.4339814, 158b6775-d035-40a6-9699-a7bab42a3cbc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.585 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.587 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.587 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] VM Resumed (Lifecycle Event)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.587 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.588 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c64ba46-2355-4737-ad6b-7255ea652f1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.587452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '69636150-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '8dd21148edfa08f285a1d3cdbda3f479291c0ff114c15cc872b4044196ff2c19'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.587452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '69636fc4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '8893ece04ee726420cd726ecfc3af47535f082ba88f7ce1006b95476b9c7837e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.587452', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': 
'817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '69637c6c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': 'f2bed9340631a4393f8f142fcb5f0596beaecc629afbb34d8a5a0f6574c75fb0'}]}, 'timestamp': '2025-10-02 12:29:16.588516', '_unique_id': '08cf1c77f60b4d65b251f4a5c5b1f98b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.589 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.590 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.590 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.590 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.591 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0298c69-c149-4dfd-bec4-b30e1b71c26d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.590388', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '6963d306-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '30b367dbdf0d0349d608041bf34736fa913e39ed10401ddd9719a1f2292fe9b3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.590388', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '6963e008-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'f7b4c83828b06eac89f73034813c8fe36c83d164743389b9a9be70cc88dd4363'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.590388', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': 
'817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '6963ec9c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': '5a4d23c3a480c8850af9181e2b3868848895c8b3c67f1f9d460db51f5909d0d9'}]}, 'timestamp': '2025-10-02 12:29:16.591386', '_unique_id': '6147a86c52f54579873bcfc979d79ec9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.592 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.593 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.593 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>]
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.593 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.read.requests volume: 1150 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.593 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.594 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.594 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e2b91ce-496d-4fca-b45a-9e49e36ea081', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1150, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-vda', 'timestamp': '2025-10-02T12:29:16.593628', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69645150-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': 'e6fc08565742d5119089400136d0787851627593d27f76aa391804f4349e202c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808-sda', 'timestamp': '2025-10-02T12:29:16.593628', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69645e34-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.116376396, 'message_signature': 'b3c0145c947f149fcec9a31f2b7e4c7d21331a18450c674130afc7ed70656fc1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-vda', 'timestamp': '2025-10-02T12:29:16.593628', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '696468c0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': '8e133d64d21cea68a874002d0d56fcedee365269f34d64de647c480be4cec834'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '158b6775-d035-40a6-9699-a7bab42a3cbc-sda', 'timestamp': '2025-10-02T12:29:16.593628', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69647266-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.142452243, 'message_signature': 'a83fa3343c9edcd007c59c89ae5da2b90717be586661f522515bb78462419ce0'}]}, 'timestamp': '2025-10-02 12:29:16.594775', '_unique_id': '7d4c32e6fed94c3bb511b0789b451b02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.595 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.596 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.597 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.597 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.incoming.packets.drop volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ee8392a-4226-4185-9ff3-cdf58e86eb48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.596610', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '6964c658-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'ad6e8d2b792aa40c3b500136b59c49ef5e5c07ffdb363cdab0d41615000ebd95'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.596610', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '6964d4d6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'bb57de8467e2b66a5069e5827b94b9d31e4744e26b005fe5928a0a8111611ee9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.596610', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': 
'817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '6964e1ec-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': '7ccee92ddee6a31089dc57dfae10a48af0d644fb3717a4008de4f65dee2e4c22'}]}, 'timestamp': '2025-10-02 12:29:16.597661', '_unique_id': '6a3f346391c8491b928c5fa1ba945eb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.598 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.599 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.599 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/cpu volume: 12770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.599 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/cpu volume: 80000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '492cfdd5-f1e6-4f22-bc5e-a3fc7a1dbee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12770000000, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'timestamp': '2025-10-02T12:29:16.599374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'instance-00000080', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '696531d8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.209473553, 'message_signature': '91e32cbdf1abf5c6afe9e45e0fee3d48cc1ce8861df6a606e8b462f1508cd1a3'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80000000, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 
'158b6775-d035-40a6-9699-a7bab42a3cbc', 'timestamp': '2025-10-02T12:29:16.599374', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'instance-00000083', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69653f0c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.237881813, 'message_signature': 'b8312128bbbfee16ebb50afd6ca74c267f82984d8fc84b3c0bba7454eeb11a17'}]}, 'timestamp': '2025-10-02 12:29:16.600033', '_unique_id': '75337d9195714e01b0a18a143a63c337'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.600 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.601 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.602 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.602 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b16152a-acc7-4d20-9a2c-1fc6d5d742d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.601680', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '69658da4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '4400f0be0a25a6414f6a568fd12149472ce220d3d06364e564fa46be2bf2cf9b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.601680', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '69659a38-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'd60561b2e444befc82b95f705ecb5807c379303f874d338cc68dbc897cf71aa1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.601680', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '6965a6c2-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': '73d4981afb0e125d76c05f711eacae4e565f8de30c61bef3e5068d321098c987'}]}, 'timestamp': '2025-10-02 12:29:16.602696', '_unique_id': 'a2892324d2fb46c8b8642bad1d8e094d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.603 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.604 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.604 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-986630317>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-2040067918>]
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.604 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.605 12 DEBUG ceilometer.compute.pollsters [-] ca651811-3d96-4b41-a50d-bbaeaf3da808/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.605 12 DEBUG ceilometer.compute.pollsters [-] 158b6775-d035-40a6-9699-a7bab42a3cbc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2500b05d-d9a6-485b-89e9-2a8c195a840a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap087d1308-7c', 'timestamp': '2025-10-02T12:29:16.604820', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap087d1308-7c', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:b7:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087d1308-7c'}, 'message_id': '696607fc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': 'fd661cec0da9981b279d73d78a5d60284b09ad51c4c89b7058323a42b4f44da0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000080-ca651811-3d96-4b41-a50d-bbaeaf3da808-tap6949320c-d0', 'timestamp': '2025-10-02T12:29:16.604820', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-986630317', 'name': 'tap6949320c-d0', 'instance_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:49:69:cf', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6949320c-d0'}, 'message_id': '6966160c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.18224813, 'message_signature': '77292852b1414772c14fd0d7c37a2135570cbb6bc9e8e8de38a7026ee5c25698'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000083-158b6775-d035-40a6-9699-a7bab42a3cbc-tapd66594bd-b2', 'timestamp': '2025-10-02T12:29:16.604820', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-2040067918', 'name': 'tapd66594bd-b2', 'instance_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:f9:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd66594bd-b2'}, 'message_id': '69662228-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6148.188195626, 'message_signature': 'ea14b20cdf190042ab59f84ba314bbf4b793461edaf2ac63565091cfc7c8591f'}]}, 'timestamp': '2025-10-02 12:29:16.605908', '_unique_id': 'd4d3529031d74eec929541ebca7ffcc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:29:16.606 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.638 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.643 2 INFO nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Took 7.98 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.643 2 DEBUG nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.646 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:16 np0005466013 systemd[1]: Started libpod-conmon-a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d.scope.
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.696 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:29:16 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:29:16 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00ae2d3cc3208429f0bb7abc04824f2bf6a1fbde4a1b99c9a261d5e0f8ced497/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.763 2 INFO nova.compute.manager [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Took 9.84 seconds to build instance.#033[00m
Oct  2 08:29:16 np0005466013 nova_compute[192144]: 2025-10-02 12:29:16.788 2 DEBUG oslo_concurrency.lockutils [None req-d7cf8452-693b-4153-8710-78a2d60c876e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:17 np0005466013 podman[241822]: 2025-10-02 12:29:17.066269025 +0000 UTC m=+0.668753634 container init a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:29:17 np0005466013 podman[241822]: 2025-10-02 12:29:17.07313273 +0000 UTC m=+0.675617309 container start a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:17 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[241837]: [NOTICE]   (241841) : New worker (241843) forked
Oct  2 08:29:17 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[241837]: [NOTICE]   (241841) : Loading success.
Oct  2 08:29:18 np0005466013 nova_compute[192144]: 2025-10-02 12:29:18.069 2 DEBUG nova.compute.manager [req-75b44beb-e271-4ee3-b108-7915fa1e554f req-a922bb56-6001-4868-9094-f87946f03781 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:18 np0005466013 nova_compute[192144]: 2025-10-02 12:29:18.070 2 DEBUG oslo_concurrency.lockutils [req-75b44beb-e271-4ee3-b108-7915fa1e554f req-a922bb56-6001-4868-9094-f87946f03781 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:18 np0005466013 nova_compute[192144]: 2025-10-02 12:29:18.071 2 DEBUG oslo_concurrency.lockutils [req-75b44beb-e271-4ee3-b108-7915fa1e554f req-a922bb56-6001-4868-9094-f87946f03781 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:18 np0005466013 nova_compute[192144]: 2025-10-02 12:29:18.071 2 DEBUG oslo_concurrency.lockutils [req-75b44beb-e271-4ee3-b108-7915fa1e554f req-a922bb56-6001-4868-9094-f87946f03781 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:18 np0005466013 nova_compute[192144]: 2025-10-02 12:29:18.072 2 DEBUG nova.compute.manager [req-75b44beb-e271-4ee3-b108-7915fa1e554f req-a922bb56-6001-4868-9094-f87946f03781 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] No waiting events found dispatching network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:18 np0005466013 nova_compute[192144]: 2025-10-02 12:29:18.072 2 WARNING nova.compute.manager [req-75b44beb-e271-4ee3-b108-7915fa1e554f req-a922bb56-6001-4868-9094-f87946f03781 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received unexpected event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:19 np0005466013 nova_compute[192144]: 2025-10-02 12:29:19.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:19 np0005466013 nova_compute[192144]: 2025-10-02 12:29:19.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:20.255 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:20.256 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:29:20 np0005466013 nova_compute[192144]: 2025-10-02 12:29:20.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:20 np0005466013 podman[241852]: 2025-10-02 12:29:20.716691312 +0000 UTC m=+0.083441155 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:29:20 np0005466013 podman[241853]: 2025-10-02 12:29:20.770661303 +0000 UTC m=+0.124175191 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:29:24 np0005466013 nova_compute[192144]: 2025-10-02 12:29:24.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:24 np0005466013 nova_compute[192144]: 2025-10-02 12:29:24.656 2 DEBUG nova.compute.manager [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-changed-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:24 np0005466013 nova_compute[192144]: 2025-10-02 12:29:24.657 2 DEBUG nova.compute.manager [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Refreshing instance network info cache due to event network-changed-d66594bd-b226-44dd-a0fd-5ff49b65d032. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:24 np0005466013 nova_compute[192144]: 2025-10-02 12:29:24.657 2 DEBUG oslo_concurrency.lockutils [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:24 np0005466013 nova_compute[192144]: 2025-10-02 12:29:24.657 2 DEBUG oslo_concurrency.lockutils [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:24 np0005466013 nova_compute[192144]: 2025-10-02 12:29:24.658 2 DEBUG nova.network.neutron [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Refreshing network info cache for port d66594bd-b226-44dd-a0fd-5ff49b65d032 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:24 np0005466013 nova_compute[192144]: 2025-10-02 12:29:24.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:25.259 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:27 np0005466013 nova_compute[192144]: 2025-10-02 12:29:27.788 2 DEBUG nova.network.neutron [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updated VIF entry in instance network info cache for port d66594bd-b226-44dd-a0fd-5ff49b65d032. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:29:27 np0005466013 nova_compute[192144]: 2025-10-02 12:29:27.788 2 DEBUG nova.network.neutron [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updating instance_info_cache with network_info: [{"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:27 np0005466013 nova_compute[192144]: 2025-10-02 12:29:27.928 2 DEBUG oslo_concurrency.lockutils [req-5fdd727d-bc71-4562-8926-8d53ed318f99 req-aefd148f-3bc3-444c-abfc-c37ec2e09a51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:29 np0005466013 nova_compute[192144]: 2025-10-02 12:29:29.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:29Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:f9:aa 10.100.0.13
Oct  2 08:29:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:29Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:f9:aa 10.100.0.13
Oct  2 08:29:29 np0005466013 nova_compute[192144]: 2025-10-02 12:29:29.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466013 nova_compute[192144]: 2025-10-02 12:29:30.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:30 np0005466013 nova_compute[192144]: 2025-10-02 12:29:30.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:29:30 np0005466013 nova_compute[192144]: 2025-10-02 12:29:30.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:31 np0005466013 nova_compute[192144]: 2025-10-02 12:29:31.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:31 np0005466013 nova_compute[192144]: 2025-10-02 12:29:31.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:31 np0005466013 nova_compute[192144]: 2025-10-02 12:29:31.175 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:31 np0005466013 nova_compute[192144]: 2025-10-02 12:29:31.175 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:29:31 np0005466013 nova_compute[192144]: 2025-10-02 12:29:31.818 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:31 np0005466013 nova_compute[192144]: 2025-10-02 12:29:31.922 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:31 np0005466013 nova_compute[192144]: 2025-10-02 12:29:31.924 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.029 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.040 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.135 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.137 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.214 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.475 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.476 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5329MB free_disk=73.14346313476562GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.477 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:32 np0005466013 nova_compute[192144]: 2025-10-02 12:29:32.477 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:33 np0005466013 nova_compute[192144]: 2025-10-02 12:29:33.571 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance ca651811-3d96-4b41-a50d-bbaeaf3da808 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:29:33 np0005466013 nova_compute[192144]: 2025-10-02 12:29:33.571 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 158b6775-d035-40a6-9699-a7bab42a3cbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:29:33 np0005466013 nova_compute[192144]: 2025-10-02 12:29:33.571 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:29:33 np0005466013 nova_compute[192144]: 2025-10-02 12:29:33.572 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:29:33 np0005466013 nova_compute[192144]: 2025-10-02 12:29:33.681 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:34 np0005466013 nova_compute[192144]: 2025-10-02 12:29:34.089 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:34 np0005466013 nova_compute[192144]: 2025-10-02 12:29:34.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:34 np0005466013 nova_compute[192144]: 2025-10-02 12:29:34.569 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:29:34 np0005466013 nova_compute[192144]: 2025-10-02 12:29:34.569 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:34 np0005466013 nova_compute[192144]: 2025-10-02 12:29:34.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:34 np0005466013 nova_compute[192144]: 2025-10-02 12:29:34.938 2 INFO nova.compute.manager [None req-cc367401-6826-459b-a2c3-bb9392ce2765 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Get console output#033[00m
Oct  2 08:29:34 np0005466013 nova_compute[192144]: 2025-10-02 12:29:34.946 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.410 2 DEBUG oslo_concurrency.lockutils [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.410 2 DEBUG oslo_concurrency.lockutils [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.410 2 INFO nova.compute.manager [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Rebooting instance#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.437 2 DEBUG oslo_concurrency.lockutils [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.438 2 DEBUG oslo_concurrency.lockutils [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.438 2 DEBUG nova.network.neutron [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:35 np0005466013 nova_compute[192144]: 2025-10-02 12:29:35.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:29:36 np0005466013 podman[241935]: 2025-10-02 12:29:36.73265952 +0000 UTC m=+0.090947090 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:29:36 np0005466013 podman[241936]: 2025-10-02 12:29:36.752209493 +0000 UTC m=+0.107267082 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:29:36 np0005466013 podman[241937]: 2025-10-02 12:29:36.767175362 +0000 UTC m=+0.124253894 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  2 08:29:37 np0005466013 nova_compute[192144]: 2025-10-02 12:29:37.024 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:37 np0005466013 nova_compute[192144]: 2025-10-02 12:29:37.826 2 DEBUG nova.network.neutron [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updating instance_info_cache with network_info: [{"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:37 np0005466013 nova_compute[192144]: 2025-10-02 12:29:37.851 2 DEBUG oslo_concurrency.lockutils [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:37 np0005466013 nova_compute[192144]: 2025-10-02 12:29:37.866 2 DEBUG nova.compute.manager [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:37 np0005466013 nova_compute[192144]: 2025-10-02 12:29:37.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:38 np0005466013 nova_compute[192144]: 2025-10-02 12:29:38.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:38 np0005466013 nova_compute[192144]: 2025-10-02 12:29:38.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:38 np0005466013 nova_compute[192144]: 2025-10-02 12:29:38.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:29:38 np0005466013 nova_compute[192144]: 2025-10-02 12:29:38.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:29:39 np0005466013 nova_compute[192144]: 2025-10-02 12:29:39.260 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:39 np0005466013 nova_compute[192144]: 2025-10-02 12:29:39.261 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:39 np0005466013 nova_compute[192144]: 2025-10-02 12:29:39.262 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:29:39 np0005466013 nova_compute[192144]: 2025-10-02 12:29:39.262 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid ca651811-3d96-4b41-a50d-bbaeaf3da808 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:39 np0005466013 nova_compute[192144]: 2025-10-02 12:29:39.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:39 np0005466013 nova_compute[192144]: 2025-10-02 12:29:39.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:40 np0005466013 kernel: tapd66594bd-b2 (unregistering): left promiscuous mode
Oct  2 08:29:40 np0005466013 NetworkManager[51205]: <info>  [1759408180.3096] device (tapd66594bd-b2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:29:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:40Z|00567|binding|INFO|Releasing lport d66594bd-b226-44dd-a0fd-5ff49b65d032 from this chassis (sb_readonly=0)
Oct  2 08:29:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:40Z|00568|binding|INFO|Setting lport d66594bd-b226-44dd-a0fd-5ff49b65d032 down in Southbound
Oct  2 08:29:40 np0005466013 nova_compute[192144]: 2025-10-02 12:29:40.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:40Z|00569|binding|INFO|Removing iface tapd66594bd-b2 ovn-installed in OVS
Oct  2 08:29:40 np0005466013 nova_compute[192144]: 2025-10-02 12:29:40.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:40 np0005466013 nova_compute[192144]: 2025-10-02 12:29:40.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.357 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:f9:aa 10.100.0.13'], port_security=['fa:16:3e:7f:f9:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1694e9dd-274b-47a7-823d-2b0bb81c13da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b545250-58ed-42b4-932e-b2ddd5229036, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d66594bd-b226-44dd-a0fd-5ff49b65d032) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.358 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d66594bd-b226-44dd-a0fd-5ff49b65d032 in datapath 9f2a05d8-1f43-4329-b388-8811ae8293ca unbound from our chassis#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.359 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f2a05d8-1f43-4329-b388-8811ae8293ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.361 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ab12c056-f02e-4067-a961-26c335ba600f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.362 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca namespace which is not needed anymore#033[00m
Oct  2 08:29:40 np0005466013 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000083.scope: Deactivated successfully.
Oct  2 08:29:40 np0005466013 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000083.scope: Consumed 13.986s CPU time.
Oct  2 08:29:40 np0005466013 systemd-machined[152202]: Machine qemu-64-instance-00000083 terminated.
Oct  2 08:29:40 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[241837]: [NOTICE]   (241841) : haproxy version is 2.8.14-c23fe91
Oct  2 08:29:40 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[241837]: [NOTICE]   (241841) : path to executable is /usr/sbin/haproxy
Oct  2 08:29:40 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[241837]: [WARNING]  (241841) : Exiting Master process...
Oct  2 08:29:40 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[241837]: [ALERT]    (241841) : Current worker (241843) exited with code 143 (Terminated)
Oct  2 08:29:40 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[241837]: [WARNING]  (241841) : All workers exited. Exiting... (0)
Oct  2 08:29:40 np0005466013 systemd[1]: libpod-a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d.scope: Deactivated successfully.
Oct  2 08:29:40 np0005466013 podman[242024]: 2025-10-02 12:29:40.525344755 +0000 UTC m=+0.046895961 container died a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:29:40 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:29:40 np0005466013 systemd[1]: var-lib-containers-storage-overlay-00ae2d3cc3208429f0bb7abc04824f2bf6a1fbde4a1b99c9a261d5e0f8ced497-merged.mount: Deactivated successfully.
Oct  2 08:29:40 np0005466013 podman[242024]: 2025-10-02 12:29:40.593060686 +0000 UTC m=+0.114611882 container cleanup a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:29:40 np0005466013 systemd[1]: libpod-conmon-a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d.scope: Deactivated successfully.
Oct  2 08:29:40 np0005466013 podman[242068]: 2025-10-02 12:29:40.725654601 +0000 UTC m=+0.109484401 container remove a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.734 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[be1dac60-cd1e-478b-a1a3-fcf6459f54dd]: (4, ('Thu Oct  2 12:29:40 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca (a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d)\na2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d\nThu Oct  2 12:29:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca (a2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d)\na2c6dd3057c7f8a14651e0182310b85f7545a2745ab80da505c3d36bdade107d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.737 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8a88a8-7d36-449a-873d-1e4ac8f9d9a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.738 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f2a05d8-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:40 np0005466013 nova_compute[192144]: 2025-10-02 12:29:40.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:40 np0005466013 kernel: tap9f2a05d8-10: left promiscuous mode
Oct  2 08:29:40 np0005466013 nova_compute[192144]: 2025-10-02 12:29:40.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.770 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd4fa5d-d533-4d47-9040-ef74e38e7a77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.813 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a73bca6d-6d9d-4980-8fb6-8d5c99f53d1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.814 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1e020bb2-9c32-4ddd-a248-c74e7d2ebd08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.847 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[889ae9b5-7219-47e0-8c05-40733ae4e3d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614734, 'reachable_time': 34771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242088, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.852 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:29:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:40.853 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a4e90a-d8ed-4224-a707-7c0a4bb855d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:40 np0005466013 systemd[1]: run-netns-ovnmeta\x2d9f2a05d8\x2d1f43\x2d4329\x2db388\x2d8811ae8293ca.mount: Deactivated successfully.
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.033 2 DEBUG nova.compute.manager [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-vif-unplugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.034 2 DEBUG oslo_concurrency.lockutils [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.034 2 DEBUG oslo_concurrency.lockutils [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.034 2 DEBUG oslo_concurrency.lockutils [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.034 2 DEBUG nova.compute.manager [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] No waiting events found dispatching network-vif-unplugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.035 2 WARNING nova.compute.manager [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received unexpected event network-vif-unplugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 for instance with vm_state active and task_state reboot_started.#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.135 2 INFO nova.virt.libvirt.driver [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance shutdown successfully.#033[00m
Oct  2 08:29:41 np0005466013 kernel: tapd66594bd-b2: entered promiscuous mode
Oct  2 08:29:41 np0005466013 systemd-udevd[242001]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:41 np0005466013 NetworkManager[51205]: <info>  [1759408181.2030] manager: (tapd66594bd-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Oct  2 08:29:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:41Z|00570|binding|INFO|Claiming lport d66594bd-b226-44dd-a0fd-5ff49b65d032 for this chassis.
Oct  2 08:29:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:41Z|00571|binding|INFO|d66594bd-b226-44dd-a0fd-5ff49b65d032: Claiming fa:16:3e:7f:f9:aa 10.100.0.13
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:41 np0005466013 NetworkManager[51205]: <info>  [1759408181.2147] device (tapd66594bd-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:29:41 np0005466013 NetworkManager[51205]: <info>  [1759408181.2160] device (tapd66594bd-b2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.216 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:f9:aa 10.100.0.13'], port_security=['fa:16:3e:7f:f9:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1694e9dd-274b-47a7-823d-2b0bb81c13da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b545250-58ed-42b4-932e-b2ddd5229036, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d66594bd-b226-44dd-a0fd-5ff49b65d032) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.217 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d66594bd-b226-44dd-a0fd-5ff49b65d032 in datapath 9f2a05d8-1f43-4329-b388-8811ae8293ca bound to our chassis#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.218 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f2a05d8-1f43-4329-b388-8811ae8293ca#033[00m
Oct  2 08:29:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:41Z|00572|binding|INFO|Setting lport d66594bd-b226-44dd-a0fd-5ff49b65d032 ovn-installed in OVS
Oct  2 08:29:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:41Z|00573|binding|INFO|Setting lport d66594bd-b226-44dd-a0fd-5ff49b65d032 up in Southbound
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.229 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee9f36f-7177-4235-b09c-67cf28a12809]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.230 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f2a05d8-11 in ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.232 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f2a05d8-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.232 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[19ecd695-129a-40d6-934f-f25f54ebe0c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.233 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f52b4f35-2820-4625-879c-fc68aa942a8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 systemd-machined[152202]: New machine qemu-65-instance-00000083.
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.243 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[762783a8-d8fb-4243-8e25-90395cfe5fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 systemd[1]: Started Virtual Machine qemu-65-instance-00000083.
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.267 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c60576c0-15bc-4e3d-bef4-33f6caf33900]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.298 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ac3e70-2b38-47d9-93f8-4d39afebd7d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.305 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb9ab79-3901-47a2-bcd9-4e1bfef5bfc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 NetworkManager[51205]: <info>  [1759408181.3072] manager: (tap9f2a05d8-10): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.336 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d603a9a3-9066-429c-95a5-6d352db9d8f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.339 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[c9368fae-664d-491e-b9dd-9266adab1a62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 NetworkManager[51205]: <info>  [1759408181.3617] device (tap9f2a05d8-10): carrier: link connected
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.367 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[445961eb-1c6b-4cb0-a3f1-46587def10a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.383 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5957f6-d95f-4910-9409-cfa893bdfb15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f2a05d8-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:03:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617297, 'reachable_time': 23163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242135, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.401 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcd25ee-a646-44ed-b59a-869e2ec54bb1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:30e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 617297, 'tstamp': 617297}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242136, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.422 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d40f27da-cf9f-407b-917d-6ce64cb469c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f2a05d8-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:03:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617297, 'reachable_time': 23163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242137, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.465 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea7b838-25c4-4afb-a4b0-3eb2d2280cde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.563 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8076cb53-ccbd-48fc-914e-c7fad03e4334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.565 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f2a05d8-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.566 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.567 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f2a05d8-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:41 np0005466013 NetworkManager[51205]: <info>  [1759408181.5708] manager: (tap9f2a05d8-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Oct  2 08:29:41 np0005466013 kernel: tap9f2a05d8-10: entered promiscuous mode
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.575 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f2a05d8-10, col_values=(('external_ids', {'iface-id': '325e84b6-74e4-4e02-a145-8f2619c4e99c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:41Z|00574|binding|INFO|Releasing lport 325e84b6-74e4-4e02-a145-8f2619c4e99c from this chassis (sb_readonly=0)
Oct  2 08:29:41 np0005466013 nova_compute[192144]: 2025-10-02 12:29:41.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.604 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f2a05d8-1f43-4329-b388-8811ae8293ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f2a05d8-1f43-4329-b388-8811ae8293ca.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.605 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[271ee4f6-22de-4b25-85a7-0be603d6869a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.606 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-9f2a05d8-1f43-4329-b388-8811ae8293ca
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/9f2a05d8-1f43-4329-b388-8811ae8293ca.pid.haproxy
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 9f2a05d8-1f43-4329-b388-8811ae8293ca
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:29:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:41.607 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'env', 'PROCESS_TAG=haproxy-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f2a05d8-1f43-4329-b388-8811ae8293ca.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:29:42 np0005466013 podman[242176]: 2025-10-02 12:29:41.987298932 +0000 UTC m=+0.034780791 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.322 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 158b6775-d035-40a6-9699-a7bab42a3cbc due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.322 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408182.3220072, 158b6775-d035-40a6-9699-a7bab42a3cbc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.322 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.327 2 INFO nova.virt.libvirt.driver [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance running successfully.#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.327 2 INFO nova.virt.libvirt.driver [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance soft rebooted successfully.#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.328 2 DEBUG nova.compute.manager [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.366 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.369 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.403 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.405 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408182.3245134, 158b6775-d035-40a6-9699-a7bab42a3cbc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.405 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] VM Started (Lifecycle Event)#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.431 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.435 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:42 np0005466013 nova_compute[192144]: 2025-10-02 12:29:42.473 2 DEBUG oslo_concurrency.lockutils [None req-8d732e41-fae3-4687-85b7-028c35b1f050 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:42 np0005466013 podman[242176]: 2025-10-02 12:29:42.511102383 +0000 UTC m=+0.558584182 container create ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:29:42 np0005466013 systemd[1]: Started libpod-conmon-ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92.scope.
Oct  2 08:29:42 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:29:42 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284e90eeca0bf2cc3f57010eed51bb22f7e2ae9584546adf349fe0f841a6f37c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:29:42 np0005466013 podman[242176]: 2025-10-02 12:29:42.656124128 +0000 UTC m=+0.703605987 container init ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:29:42 np0005466013 podman[242176]: 2025-10-02 12:29:42.666451821 +0000 UTC m=+0.713933630 container start ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:42 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [NOTICE]   (242195) : New worker (242197) forked
Oct  2 08:29:42 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [NOTICE]   (242195) : Loading success.
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.176 2 DEBUG nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.177 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.177 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.178 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.178 2 DEBUG nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] No waiting events found dispatching network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.179 2 WARNING nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received unexpected event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.179 2 DEBUG nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.180 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.180 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.181 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.181 2 DEBUG nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] No waiting events found dispatching network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.182 2 WARNING nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received unexpected event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.182 2 DEBUG nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.182 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.183 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.183 2 DEBUG oslo_concurrency.lockutils [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.184 2 DEBUG nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] No waiting events found dispatching network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.184 2 WARNING nova.compute.manager [req-c5afb61e-4ae1-4524-a685-ea77c4a90b35 req-b65d7fde-ab23-4a92-8ab4-40be2babf1ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received unexpected event network-vif-plugged-d66594bd-b226-44dd-a0fd-5ff49b65d032 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.654 2 DEBUG nova.compute.manager [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-changed-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.656 2 DEBUG nova.compute.manager [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing instance network info cache due to event network-changed-087d1308-7c0a-45ab-b876-1dfdceb622d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.657 2 DEBUG oslo_concurrency.lockutils [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.878 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updating instance_info_cache with network_info: [{"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.881 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.882 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.882 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.882 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.882 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.906 2 INFO nova.compute.manager [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Terminating instance#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.931 2 DEBUG nova.compute.manager [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.948 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.948 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.948 2 DEBUG oslo_concurrency.lockutils [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.949 2 DEBUG nova.network.neutron [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Refreshing network info cache for port 087d1308-7c0a-45ab-b876-1dfdceb622d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.950 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.950 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.950 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.950 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.951 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:29:43 np0005466013 kernel: tap087d1308-7c (unregistering): left promiscuous mode
Oct  2 08:29:43 np0005466013 NetworkManager[51205]: <info>  [1759408183.9587] device (tap087d1308-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:29:43 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.974 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:29:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:43Z|00575|binding|INFO|Releasing lport 087d1308-7c0a-45ab-b876-1dfdceb622d7 from this chassis (sb_readonly=0)
Oct  2 08:29:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:43Z|00576|binding|INFO|Setting lport 087d1308-7c0a-45ab-b876-1dfdceb622d7 down in Southbound
Oct  2 08:29:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:43Z|00577|binding|INFO|Removing iface tap087d1308-7c ovn-installed in OVS
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:43.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.005 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:b7:65 10.100.0.13'], port_security=['fa:16:3e:ac:b7:65 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a731a9f5-9e55-440a-a95e-a9a819598de7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=087d1308-7c0a-45ab-b876-1dfdceb622d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.009 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 087d1308-7c0a-45ab-b876-1dfdceb622d7 in datapath d68eafa7-b35f-4bd9-ba11-e28a73bc7849 unbound from our chassis#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.012 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d68eafa7-b35f-4bd9-ba11-e28a73bc7849, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:29:44 np0005466013 kernel: tap6949320c-d0 (unregistering): left promiscuous mode
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.014 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[83eb8c75-1fd0-48e4-b9d9-be56c4df2936]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.015 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 namespace which is not needed anymore#033[00m
Oct  2 08:29:44 np0005466013 NetworkManager[51205]: <info>  [1759408184.0183] device (tap6949320c-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:44Z|00578|binding|INFO|Releasing lport 6949320c-d0cb-4a9d-a882-f6d1aac564c3 from this chassis (sb_readonly=0)
Oct  2 08:29:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:44Z|00579|binding|INFO|Setting lport 6949320c-d0cb-4a9d-a882-f6d1aac564c3 down in Southbound
Oct  2 08:29:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:44Z|00580|binding|INFO|Removing iface tap6949320c-d0 ovn-installed in OVS
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.049 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:69:cf 2001:db8::f816:3eff:fe49:69cf'], port_security=['fa:16:3e:49:69:cf 2001:db8::f816:3eff:fe49:69cf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe49:69cf/64', 'neutron:device_id': 'ca651811-3d96-4b41-a50d-bbaeaf3da808', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85092873-751b-414a-a9a1-112c2e61cb13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1cb8f94-a0b5-458e-a15a-45916ae4369f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=6949320c-d0cb-4a9d-a882-f6d1aac564c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000080.scope: Deactivated successfully.
Oct  2 08:29:44 np0005466013 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000080.scope: Consumed 18.256s CPU time.
Oct  2 08:29:44 np0005466013 systemd-machined[152202]: Machine qemu-63-instance-00000080 terminated.
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 NetworkManager[51205]: <info>  [1759408184.1640] manager: (tap6949320c-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[241122]: [NOTICE]   (241126) : haproxy version is 2.8.14-c23fe91
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[241122]: [NOTICE]   (241126) : path to executable is /usr/sbin/haproxy
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[241122]: [WARNING]  (241126) : Exiting Master process...
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[241122]: [ALERT]    (241126) : Current worker (241128) exited with code 143 (Terminated)
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[241122]: [WARNING]  (241126) : All workers exited. Exiting... (0)
Oct  2 08:29:44 np0005466013 systemd[1]: libpod-fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd.scope: Deactivated successfully.
Oct  2 08:29:44 np0005466013 podman[242232]: 2025-10-02 12:29:44.179508259 +0000 UTC m=+0.060519657 container died fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:29:44 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd-userdata-shm.mount: Deactivated successfully.
Oct  2 08:29:44 np0005466013 systemd[1]: var-lib-containers-storage-overlay-f99c4c5eaf4e543d946adc95f4a497ce712e8a181ed8712901a3b0bdc688f2ea-merged.mount: Deactivated successfully.
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.228 2 INFO nova.virt.libvirt.driver [-] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Instance destroyed successfully.#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.229 2 DEBUG nova.objects.instance [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid ca651811-3d96-4b41-a50d-bbaeaf3da808 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:44 np0005466013 podman[242232]: 2025-10-02 12:29:44.232932463 +0000 UTC m=+0.113943861 container cleanup fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:44 np0005466013 systemd[1]: libpod-conmon-fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd.scope: Deactivated successfully.
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.253 2 DEBUG nova.virt.libvirt.vif [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-986630317',display_name='tempest-TestGettingAddress-server-986630317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-986630317',id=128,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:28:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ku485ehg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:28:18Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=ca651811-3d96-4b41-a50d-bbaeaf3da808,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.254 2 DEBUG nova.network.os_vif_util [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.255 2 DEBUG nova.network.os_vif_util [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:b7:65,bridge_name='br-int',has_traffic_filtering=True,id=087d1308-7c0a-45ab-b876-1dfdceb622d7,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087d1308-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.255 2 DEBUG os_vif [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:b7:65,bridge_name='br-int',has_traffic_filtering=True,id=087d1308-7c0a-45ab-b876-1dfdceb622d7,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087d1308-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap087d1308-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.265 2 INFO os_vif [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:b7:65,bridge_name='br-int',has_traffic_filtering=True,id=087d1308-7c0a-45ab-b876-1dfdceb622d7,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087d1308-7c')#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.266 2 DEBUG nova.virt.libvirt.vif [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-986630317',display_name='tempest-TestGettingAddress-server-986630317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-986630317',id=128,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:28:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ku485ehg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:28:18Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=ca651811-3d96-4b41-a50d-bbaeaf3da808,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.266 2 DEBUG nova.network.os_vif_util [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.267 2 DEBUG nova.network.os_vif_util [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:69:cf,bridge_name='br-int',has_traffic_filtering=True,id=6949320c-d0cb-4a9d-a882-f6d1aac564c3,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6949320c-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.267 2 DEBUG os_vif [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:69:cf,bridge_name='br-int',has_traffic_filtering=True,id=6949320c-d0cb-4a9d-a882-f6d1aac564c3,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6949320c-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.269 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6949320c-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.273 2 INFO os_vif [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:69:cf,bridge_name='br-int',has_traffic_filtering=True,id=6949320c-d0cb-4a9d-a882-f6d1aac564c3,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6949320c-d0')#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.273 2 INFO nova.virt.libvirt.driver [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Deleting instance files /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808_del#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.274 2 INFO nova.virt.libvirt.driver [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Deletion of /var/lib/nova/instances/ca651811-3d96-4b41-a50d-bbaeaf3da808_del complete#033[00m
Oct  2 08:29:44 np0005466013 podman[242287]: 2025-10-02 12:29:44.311094282 +0000 UTC m=+0.054807738 container remove fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.316 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[32f17242-6f63-4f9c-a3db-ebab9e799e8f]: (4, ('Thu Oct  2 12:29:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 (fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd)\nfe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd\nThu Oct  2 12:29:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 (fe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd)\nfe508494363446b0ce8416e788d77942ecadbb6f73117d0d95eb34c5ccf4f2bd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.318 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ae367cda-fc1a-403a-ac78-ce0b6271adfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.319 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd68eafa7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:44 np0005466013 kernel: tapd68eafa7-b0: left promiscuous mode
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.338 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0a4592-9771-47f4-9bbb-6f1735c849bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.361 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c64ab383-195f-49cb-9fe1-7ed3e7e3dd09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.362 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c7574a-ae69-4acc-880c-954f2a593972]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.380 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bda20add-14ea-436c-a026-6d22b867184e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608693, 'reachable_time': 23762, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242305, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.383 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.383 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[9212e7df-78c8-40e8-935b-fb1866f5754a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 systemd[1]: run-netns-ovnmeta\x2dd68eafa7\x2db35f\x2d4bd9\x2dba11\x2de28a73bc7849.mount: Deactivated successfully.
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.386 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 6949320c-d0cb-4a9d-a882-f6d1aac564c3 in datapath 85092873-751b-414a-a9a1-112c2e61cb13 unbound from our chassis#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.388 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85092873-751b-414a-a9a1-112c2e61cb13, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.389 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0bbd76e7-7a2c-4327-9424-3385d4cbf98b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.389 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 namespace which is not needed anymore#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.423 2 DEBUG nova.compute.manager [req-ca07842b-6d8d-48a4-9c28-1d5d28fd7681 req-1efebef1-ab06-4f86-9514-debc1c206b7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-unplugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.424 2 DEBUG oslo_concurrency.lockutils [req-ca07842b-6d8d-48a4-9c28-1d5d28fd7681 req-1efebef1-ab06-4f86-9514-debc1c206b7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.424 2 DEBUG oslo_concurrency.lockutils [req-ca07842b-6d8d-48a4-9c28-1d5d28fd7681 req-1efebef1-ab06-4f86-9514-debc1c206b7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.424 2 DEBUG oslo_concurrency.lockutils [req-ca07842b-6d8d-48a4-9c28-1d5d28fd7681 req-1efebef1-ab06-4f86-9514-debc1c206b7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.424 2 DEBUG nova.compute.manager [req-ca07842b-6d8d-48a4-9c28-1d5d28fd7681 req-1efebef1-ab06-4f86-9514-debc1c206b7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] No waiting events found dispatching network-vif-unplugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.424 2 DEBUG nova.compute.manager [req-ca07842b-6d8d-48a4-9c28-1d5d28fd7681 req-1efebef1-ab06-4f86-9514-debc1c206b7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-unplugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.462 2 INFO nova.compute.manager [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.462 2 DEBUG oslo.service.loopingcall [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.462 2 DEBUG nova.compute.manager [-] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.462 2 DEBUG nova.network.neutron [-] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[241195]: [NOTICE]   (241199) : haproxy version is 2.8.14-c23fe91
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[241195]: [NOTICE]   (241199) : path to executable is /usr/sbin/haproxy
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[241195]: [ALERT]    (241199) : Current worker (241201) exited with code 143 (Terminated)
Oct  2 08:29:44 np0005466013 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[241195]: [WARNING]  (241199) : All workers exited. Exiting... (0)
Oct  2 08:29:44 np0005466013 systemd[1]: libpod-b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687.scope: Deactivated successfully.
Oct  2 08:29:44 np0005466013 podman[242323]: 2025-10-02 12:29:44.546676523 +0000 UTC m=+0.061232420 container died b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:29:44 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687-userdata-shm.mount: Deactivated successfully.
Oct  2 08:29:44 np0005466013 systemd[1]: var-lib-containers-storage-overlay-7bd4dd73780ac142a6cea288365b78ad003e2dc4b791dd69b90825c1bd164fef-merged.mount: Deactivated successfully.
Oct  2 08:29:44 np0005466013 podman[242323]: 2025-10-02 12:29:44.587970087 +0000 UTC m=+0.102525944 container cleanup b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:44 np0005466013 systemd[1]: libpod-conmon-b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687.scope: Deactivated successfully.
Oct  2 08:29:44 np0005466013 podman[242355]: 2025-10-02 12:29:44.649916658 +0000 UTC m=+0.040814630 container remove b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.655 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[00812b50-d4ed-47c6-a1d5-8c1fac45b116]: (4, ('Thu Oct  2 12:29:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 (b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687)\nb6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687\nThu Oct  2 12:29:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 (b6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687)\nb6132df3ebb11e5e9c2532671ce762b859bfe9e1abacc58fd47c9b8f54e98687\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.656 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e60230b8-5e36-4960-aaa4-00c10c15d4e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.657 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85092873-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 kernel: tap85092873-70: left promiscuous mode
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.663 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4065f991-804f-4be7-9f36-1fa5f47cf5ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.699 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4de30c47-13bf-4fd7-ae79-d2285ca2b3b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.700 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9b5c160d-a3ba-4e22-a100-895ed8014961]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.722 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[308a082b-5050-47ed-a54f-4dfc860ee9af]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608816, 'reachable_time': 16395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242370, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.724 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:29:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:29:44.724 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[73b42b36-d5a2-4012-9cbc-42d668a08196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466013 nova_compute[192144]: 2025-10-02 12:29:44.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:45Z|00581|binding|INFO|Releasing lport 325e84b6-74e4-4e02-a145-8f2619c4e99c from this chassis (sb_readonly=0)
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005466013 systemd[1]: run-netns-ovnmeta\x2d85092873\x2d751b\x2d414a\x2da9a1\x2d112c2e61cb13.mount: Deactivated successfully.
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.349 2 DEBUG nova.compute.manager [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-unplugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.349 2 DEBUG oslo_concurrency.lockutils [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.350 2 DEBUG oslo_concurrency.lockutils [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.350 2 DEBUG oslo_concurrency.lockutils [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.350 2 DEBUG nova.compute.manager [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] No waiting events found dispatching network-vif-unplugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.351 2 DEBUG nova.compute.manager [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-unplugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.351 2 DEBUG nova.compute.manager [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.351 2 DEBUG oslo_concurrency.lockutils [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.352 2 DEBUG oslo_concurrency.lockutils [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.352 2 DEBUG oslo_concurrency.lockutils [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.352 2 DEBUG nova.compute.manager [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] No waiting events found dispatching network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.353 2 WARNING nova.compute.manager [req-55e24dec-0ce6-41c6-b2b6-f6f532a66bbc req-20d54636-a24c-44b1-89ce-311bcc62c52a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received unexpected event network-vif-plugged-087d1308-7c0a-45ab-b876-1dfdceb622d7 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.668 2 DEBUG nova.network.neutron [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updated VIF entry in instance network info cache for port 087d1308-7c0a-45ab-b876-1dfdceb622d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.669 2 DEBUG nova.network.neutron [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updating instance_info_cache with network_info: [{"id": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "address": "fa:16:3e:ac:b7:65", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087d1308-7c", "ovs_interfaceid": "087d1308-7c0a-45ab-b876-1dfdceb622d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "address": "fa:16:3e:49:69:cf", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe49:69cf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6949320c-d0", "ovs_interfaceid": "6949320c-d0cb-4a9d-a882-f6d1aac564c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:45 np0005466013 nova_compute[192144]: 2025-10-02 12:29:45.694 2 DEBUG oslo_concurrency.lockutils [req-035ca9ea-3492-4015-be89-c0e1eb1d3244 req-f1c55d2f-22da-40b9-bdc9-47b6ddd70885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ca651811-3d96-4b41-a50d-bbaeaf3da808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:46 np0005466013 nova_compute[192144]: 2025-10-02 12:29:46.641 2 DEBUG nova.compute.manager [req-d17ae0c2-ede1-4c1d-9901-d90d58731185 req-1e5542a0-33b7-4ce7-95d5-4451c5ae283e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:46 np0005466013 nova_compute[192144]: 2025-10-02 12:29:46.642 2 DEBUG oslo_concurrency.lockutils [req-d17ae0c2-ede1-4c1d-9901-d90d58731185 req-1e5542a0-33b7-4ce7-95d5-4451c5ae283e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:46 np0005466013 nova_compute[192144]: 2025-10-02 12:29:46.642 2 DEBUG oslo_concurrency.lockutils [req-d17ae0c2-ede1-4c1d-9901-d90d58731185 req-1e5542a0-33b7-4ce7-95d5-4451c5ae283e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:46 np0005466013 nova_compute[192144]: 2025-10-02 12:29:46.643 2 DEBUG oslo_concurrency.lockutils [req-d17ae0c2-ede1-4c1d-9901-d90d58731185 req-1e5542a0-33b7-4ce7-95d5-4451c5ae283e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:46 np0005466013 nova_compute[192144]: 2025-10-02 12:29:46.643 2 DEBUG nova.compute.manager [req-d17ae0c2-ede1-4c1d-9901-d90d58731185 req-1e5542a0-33b7-4ce7-95d5-4451c5ae283e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] No waiting events found dispatching network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:46 np0005466013 nova_compute[192144]: 2025-10-02 12:29:46.644 2 WARNING nova.compute.manager [req-d17ae0c2-ede1-4c1d-9901-d90d58731185 req-1e5542a0-33b7-4ce7-95d5-4451c5ae283e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received unexpected event network-vif-plugged-6949320c-d0cb-4a9d-a882-f6d1aac564c3 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:29:46 np0005466013 podman[242372]: 2025-10-02 12:29:46.746859291 +0000 UTC m=+0.107819020 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm)
Oct  2 08:29:46 np0005466013 podman[242371]: 2025-10-02 12:29:46.752520587 +0000 UTC m=+0.105105334 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:29:46 np0005466013 podman[242373]: 2025-10-02 12:29:46.752615821 +0000 UTC m=+0.084590852 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.153 2 DEBUG nova.network.neutron [-] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.182 2 INFO nova.compute.manager [-] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Took 2.72 seconds to deallocate network for instance.#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.301 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.302 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.706 2 DEBUG nova.compute.provider_tree [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.743 2 DEBUG nova.scheduler.client.report [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.771 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.828 2 INFO nova.scheduler.client.report [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance ca651811-3d96-4b41-a50d-bbaeaf3da808#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.866 2 DEBUG nova.compute.manager [req-7c34e72a-296a-4053-a40d-c30dfd9363db req-1e9cae60-4e25-4b93-b0d9-611b4750265b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-deleted-087d1308-7c0a-45ab-b876-1dfdceb622d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.866 2 DEBUG nova.compute.manager [req-7c34e72a-296a-4053-a40d-c30dfd9363db req-1e9cae60-4e25-4b93-b0d9-611b4750265b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Received event network-vif-deleted-6949320c-d0cb-4a9d-a882-f6d1aac564c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:47 np0005466013 nova_compute[192144]: 2025-10-02 12:29:47.948 2 DEBUG oslo_concurrency.lockutils [None req-e702d4ee-b3a3-41e7-9cef-56e11ae80e39 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "ca651811-3d96-4b41-a50d-bbaeaf3da808" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:48 np0005466013 nova_compute[192144]: 2025-10-02 12:29:48.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:49 np0005466013 nova_compute[192144]: 2025-10-02 12:29:49.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:49 np0005466013 nova_compute[192144]: 2025-10-02 12:29:49.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:51 np0005466013 podman[242431]: 2025-10-02 12:29:51.704938019 +0000 UTC m=+0.072228884 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:29:51 np0005466013 podman[242430]: 2025-10-02 12:29:51.726267978 +0000 UTC m=+0.097500887 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:29:52 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:52Z|00582|binding|INFO|Releasing lport 325e84b6-74e4-4e02-a145-8f2619c4e99c from this chassis (sb_readonly=0)
Oct  2 08:29:52 np0005466013 nova_compute[192144]: 2025-10-02 12:29:52.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:53 np0005466013 nova_compute[192144]: 2025-10-02 12:29:53.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:54 np0005466013 nova_compute[192144]: 2025-10-02 12:29:54.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:54 np0005466013 nova_compute[192144]: 2025-10-02 12:29:54.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:54 np0005466013 nova_compute[192144]: 2025-10-02 12:29:54.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:55 np0005466013 ovn_controller[94366]: 2025-10-02T12:29:55Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:f9:aa 10.100.0.13
Oct  2 08:29:55 np0005466013 nova_compute[192144]: 2025-10-02 12:29:55.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:59 np0005466013 nova_compute[192144]: 2025-10-02 12:29:59.228 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408184.2270105, ca651811-3d96-4b41-a50d-bbaeaf3da808 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:59 np0005466013 nova_compute[192144]: 2025-10-02 12:29:59.229 2 INFO nova.compute.manager [-] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:29:59 np0005466013 nova_compute[192144]: 2025-10-02 12:29:59.250 2 DEBUG nova.compute.manager [None req-fa0a003c-c57a-42a1-b49a-ed759c9810fa - - - - - -] [instance: ca651811-3d96-4b41-a50d-bbaeaf3da808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:59 np0005466013 nova_compute[192144]: 2025-10-02 12:29:59.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:59 np0005466013 nova_compute[192144]: 2025-10-02 12:29:59.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:59 np0005466013 nova_compute[192144]: 2025-10-02 12:29:59.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:00 np0005466013 nova_compute[192144]: 2025-10-02 12:30:00.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:01 np0005466013 nova_compute[192144]: 2025-10-02 12:30:01.612 2 INFO nova.compute.manager [None req-f3cec9e5-8332-4300-939a-51df54dbf5a0 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Get console output#033[00m
Oct  2 08:30:01 np0005466013 nova_compute[192144]: 2025-10-02 12:30:01.620 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:30:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:02.313 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:02.315 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:02.316 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:02 np0005466013 nova_compute[192144]: 2025-10-02 12:30:02.768 2 DEBUG nova.compute.manager [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-changed-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:02 np0005466013 nova_compute[192144]: 2025-10-02 12:30:02.769 2 DEBUG nova.compute.manager [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Refreshing instance network info cache due to event network-changed-d66594bd-b226-44dd-a0fd-5ff49b65d032. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:02 np0005466013 nova_compute[192144]: 2025-10-02 12:30:02.770 2 DEBUG oslo_concurrency.lockutils [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:02 np0005466013 nova_compute[192144]: 2025-10-02 12:30:02.770 2 DEBUG oslo_concurrency.lockutils [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:02 np0005466013 nova_compute[192144]: 2025-10-02 12:30:02.770 2 DEBUG nova.network.neutron [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Refreshing network info cache for port d66594bd-b226-44dd-a0fd-5ff49b65d032 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:02 np0005466013 nova_compute[192144]: 2025-10-02 12:30:02.853 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:02 np0005466013 nova_compute[192144]: 2025-10-02 12:30:02.854 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.032 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.033 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.033 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.048 2 INFO nova.compute.manager [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Terminating instance#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.063 2 DEBUG nova.compute.manager [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:30:03 np0005466013 kernel: tapd66594bd-b2 (unregistering): left promiscuous mode
Oct  2 08:30:03 np0005466013 NetworkManager[51205]: <info>  [1759408203.0943] device (tapd66594bd-b2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:30:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:03Z|00583|binding|INFO|Releasing lport d66594bd-b226-44dd-a0fd-5ff49b65d032 from this chassis (sb_readonly=0)
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:03Z|00584|binding|INFO|Setting lport d66594bd-b226-44dd-a0fd-5ff49b65d032 down in Southbound
Oct  2 08:30:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:03Z|00585|binding|INFO|Removing iface tapd66594bd-b2 ovn-installed in OVS
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.120 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:f9:aa 10.100.0.13'], port_security=['fa:16:3e:7f:f9:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '158b6775-d035-40a6-9699-a7bab42a3cbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1694e9dd-274b-47a7-823d-2b0bb81c13da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b545250-58ed-42b4-932e-b2ddd5229036, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d66594bd-b226-44dd-a0fd-5ff49b65d032) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.122 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d66594bd-b226-44dd-a0fd-5ff49b65d032 in datapath 9f2a05d8-1f43-4329-b388-8811ae8293ca unbound from our chassis#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.123 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f2a05d8-1f43-4329-b388-8811ae8293ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.127 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e43babe4-746f-46e8-a004-e964b342648d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.127 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca namespace which is not needed anymore#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005466013 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000083.scope: Deactivated successfully.
Oct  2 08:30:03 np0005466013 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000083.scope: Consumed 13.132s CPU time.
Oct  2 08:30:03 np0005466013 systemd-machined[152202]: Machine qemu-65-instance-00000083 terminated.
Oct  2 08:30:03 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [NOTICE]   (242195) : haproxy version is 2.8.14-c23fe91
Oct  2 08:30:03 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [NOTICE]   (242195) : path to executable is /usr/sbin/haproxy
Oct  2 08:30:03 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [WARNING]  (242195) : Exiting Master process...
Oct  2 08:30:03 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [WARNING]  (242195) : Exiting Master process...
Oct  2 08:30:03 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [ALERT]    (242195) : Current worker (242197) exited with code 143 (Terminated)
Oct  2 08:30:03 np0005466013 neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca[242191]: [WARNING]  (242195) : All workers exited. Exiting... (0)
Oct  2 08:30:03 np0005466013 systemd[1]: libpod-ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92.scope: Deactivated successfully.
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.355 2 INFO nova.virt.libvirt.driver [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Instance destroyed successfully.#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.356 2 DEBUG nova.objects.instance [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'resources' on Instance uuid 158b6775-d035-40a6-9699-a7bab42a3cbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:03 np0005466013 podman[242505]: 2025-10-02 12:30:03.360972809 +0000 UTC m=+0.078752766 container died ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.369 2 DEBUG nova.virt.libvirt.vif [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:29:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2040067918',display_name='tempest-TestNetworkAdvancedServerOps-server-2040067918',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2040067918',id=131,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJH+nB7ynWmlE6CYxakDXji/dFc1+oe8aupy/8lQbmwUK6XXJX68remR0div3FEoW99WG9y1B7WUExwGPOYQ/687fHl0sNVIGCh6BhE9C68EXmJ+PMvz0f/nt1NeHV55pA==',key_name='tempest-TestNetworkAdvancedServerOps-1864208372',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-b06f730g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:29:42Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=158b6775-d035-40a6-9699-a7bab42a3cbc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.369 2 DEBUG nova.network.os_vif_util [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.369 2 DEBUG nova.network.os_vif_util [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:f9:aa,bridge_name='br-int',has_traffic_filtering=True,id=d66594bd-b226-44dd-a0fd-5ff49b65d032,network=Network(9f2a05d8-1f43-4329-b388-8811ae8293ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66594bd-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.370 2 DEBUG os_vif [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:f9:aa,bridge_name='br-int',has_traffic_filtering=True,id=d66594bd-b226-44dd-a0fd-5ff49b65d032,network=Network(9f2a05d8-1f43-4329-b388-8811ae8293ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66594bd-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.371 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd66594bd-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.418 2 INFO os_vif [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:f9:aa,bridge_name='br-int',has_traffic_filtering=True,id=d66594bd-b226-44dd-a0fd-5ff49b65d032,network=Network(9f2a05d8-1f43-4329-b388-8811ae8293ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd66594bd-b2')#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.419 2 INFO nova.virt.libvirt.driver [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Deleting instance files /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc_del#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.420 2 INFO nova.virt.libvirt.driver [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Deletion of /var/lib/nova/instances/158b6775-d035-40a6-9699-a7bab42a3cbc_del complete#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.499 2 INFO nova.compute.manager [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.500 2 DEBUG oslo.service.loopingcall [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.500 2 DEBUG nova.compute.manager [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.500 2 DEBUG nova.network.neutron [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:30:03 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92-userdata-shm.mount: Deactivated successfully.
Oct  2 08:30:03 np0005466013 systemd[1]: var-lib-containers-storage-overlay-284e90eeca0bf2cc3f57010eed51bb22f7e2ae9584546adf349fe0f841a6f37c-merged.mount: Deactivated successfully.
Oct  2 08:30:03 np0005466013 podman[242505]: 2025-10-02 12:30:03.598478124 +0000 UTC m=+0.316258071 container cleanup ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:30:03 np0005466013 systemd[1]: libpod-conmon-ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92.scope: Deactivated successfully.
Oct  2 08:30:03 np0005466013 podman[242547]: 2025-10-02 12:30:03.919287834 +0000 UTC m=+0.277538334 container remove ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.930 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[042ebe5c-b140-4845-95bd-f26c9b77dad4]: (4, ('Thu Oct  2 12:30:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca (ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92)\nee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92\nThu Oct  2 12:30:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca (ee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92)\nee348fd0cc8e79145c296ba8ebb74844f84044a7f1b405354a2c403e48288c92\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.932 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6f6053-cf1e-4630-bd77-f850d6b890ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.934 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f2a05d8-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005466013 kernel: tap9f2a05d8-10: left promiscuous mode
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.945 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4cddef75-b7a9-4b47-9897-a9e2c0cf6b91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:03 np0005466013 nova_compute[192144]: 2025-10-02 12:30:03.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.991 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[01971c44-950d-4593-9606-d6f5e6368e28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:03.993 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8e71084d-2785-485a-9871-02083882c241]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:04.021 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[048ce97b-a1b8-4625-a161-53c782074579]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 617291, 'reachable_time': 15499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242562, 'error': None, 'target': 'ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:04.026 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f2a05d8-1f43-4329-b388-8811ae8293ca deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:30:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:04.027 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[e777560e-a5c4-48fa-b43c-f3953f27f350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:04 np0005466013 systemd[1]: run-netns-ovnmeta\x2d9f2a05d8\x2d1f43\x2d4329\x2db388\x2d8811ae8293ca.mount: Deactivated successfully.
Oct  2 08:30:04 np0005466013 nova_compute[192144]: 2025-10-02 12:30:04.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:05 np0005466013 nova_compute[192144]: 2025-10-02 12:30:05.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:05 np0005466013 nova_compute[192144]: 2025-10-02 12:30:05.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:05 np0005466013 nova_compute[192144]: 2025-10-02 12:30:05.904 2 DEBUG nova.network.neutron [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updated VIF entry in instance network info cache for port d66594bd-b226-44dd-a0fd-5ff49b65d032. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:05 np0005466013 nova_compute[192144]: 2025-10-02 12:30:05.905 2 DEBUG nova.network.neutron [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updating instance_info_cache with network_info: [{"id": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "address": "fa:16:3e:7f:f9:aa", "network": {"id": "9f2a05d8-1f43-4329-b388-8811ae8293ca", "bridge": "br-int", "label": "tempest-network-smoke--1590153811", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd66594bd-b2", "ovs_interfaceid": "d66594bd-b226-44dd-a0fd-5ff49b65d032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:05 np0005466013 nova_compute[192144]: 2025-10-02 12:30:05.934 2 DEBUG nova.network.neutron [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:05 np0005466013 nova_compute[192144]: 2025-10-02 12:30:05.936 2 DEBUG oslo_concurrency.lockutils [req-7a45fead-f690-4446-8aa6-f29e07060737 req-bfffa797-3c99-41af-b046-34e3542c12bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-158b6775-d035-40a6-9699-a7bab42a3cbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:05 np0005466013 nova_compute[192144]: 2025-10-02 12:30:05.964 2 INFO nova.compute.manager [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Took 2.46 seconds to deallocate network for instance.#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.064 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.065 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.074 2 DEBUG nova.compute.manager [req-0783d721-1fef-4c68-8cb2-e590507bad5b req-abe3c254-c9ba-4b19-8869-245509047173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Received event network-vif-deleted-d66594bd-b226-44dd-a0fd-5ff49b65d032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.134 2 DEBUG nova.compute.provider_tree [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.151 2 DEBUG nova.scheduler.client.report [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.173 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.205 2 INFO nova.scheduler.client.report [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Deleted allocations for instance 158b6775-d035-40a6-9699-a7bab42a3cbc#033[00m
Oct  2 08:30:06 np0005466013 nova_compute[192144]: 2025-10-02 12:30:06.307 2 DEBUG oslo_concurrency.lockutils [None req-a0fcd2af-2799-4273-8550-ee080f7c81cb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "158b6775-d035-40a6-9699-a7bab42a3cbc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:07 np0005466013 podman[242564]: 2025-10-02 12:30:07.708794856 +0000 UTC m=+0.065512066 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:30:07 np0005466013 podman[242566]: 2025-10-02 12:30:07.729675469 +0000 UTC m=+0.090485805 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:30:07 np0005466013 podman[242565]: 2025-10-02 12:30:07.741679188 +0000 UTC m=+0.099370080 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:30:08 np0005466013 nova_compute[192144]: 2025-10-02 12:30:08.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:09 np0005466013 nova_compute[192144]: 2025-10-02 12:30:09.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:13 np0005466013 nova_compute[192144]: 2025-10-02 12:30:13.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:14 np0005466013 nova_compute[192144]: 2025-10-02 12:30:14.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:17 np0005466013 podman[242630]: 2025-10-02 12:30:17.68894117 +0000 UTC m=+0.065216197 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 08:30:17 np0005466013 podman[242632]: 2025-10-02 12:30:17.702108307 +0000 UTC m=+0.067282125 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:30:17 np0005466013 podman[242631]: 2025-10-02 12:30:17.708988996 +0000 UTC m=+0.070957928 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:30:18 np0005466013 nova_compute[192144]: 2025-10-02 12:30:18.356 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408203.3539512, 158b6775-d035-40a6-9699-a7bab42a3cbc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:18 np0005466013 nova_compute[192144]: 2025-10-02 12:30:18.356 2 INFO nova.compute.manager [-] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:30:18 np0005466013 nova_compute[192144]: 2025-10-02 12:30:18.382 2 DEBUG nova.compute.manager [None req-8fcc6ba0-b73a-4372-ade7-b8551f0e3edc - - - - - -] [instance: 158b6775-d035-40a6-9699-a7bab42a3cbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:18 np0005466013 nova_compute[192144]: 2025-10-02 12:30:18.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:19 np0005466013 nova_compute[192144]: 2025-10-02 12:30:19.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:20.395 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:20 np0005466013 nova_compute[192144]: 2025-10-02 12:30:20.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:20.396 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:30:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:20.978 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:31:73 2001:db8:0:1:f816:3eff:fe2e:3173 2001:db8::f816:3eff:fe2e:3173'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe2e:3173/64 2001:db8::f816:3eff:fe2e:3173/64', 'neutron:device_id': 'ovnmeta-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=876a7f58-2645-4e1a-8a60-dbbe16fdfb2e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3eb0ed9e-d99b-4ee6-af64-ada9c8369b17) old=Port_Binding(mac=['fa:16:3e:2e:31:73 2001:db8::f816:3eff:fe2e:3173'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2e:3173/64', 'neutron:device_id': 'ovnmeta-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:20.980 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3eb0ed9e-d99b-4ee6-af64-ada9c8369b17 in datapath e2520108-9d67-4d82-a7a0-ba429a88c3c9 updated#033[00m
Oct  2 08:30:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:20.983 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e2520108-9d67-4d82-a7a0-ba429a88c3c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:30:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:20.984 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e63b739a-7c29-4b9b-b915-601ad04694da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:22 np0005466013 podman[242691]: 2025-10-02 12:30:22.717898325 +0000 UTC m=+0.077737082 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:30:22 np0005466013 podman[242690]: 2025-10-02 12:30:22.725529169 +0000 UTC m=+0.084510177 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:30:23 np0005466013 nova_compute[192144]: 2025-10-02 12:30:23.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:24 np0005466013 nova_compute[192144]: 2025-10-02 12:30:24.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:25.398 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:28 np0005466013 nova_compute[192144]: 2025-10-02 12:30:28.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005466013 nova_compute[192144]: 2025-10-02 12:30:29.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.014 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.072 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.072 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.072 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.072 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.246 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.247 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5720MB free_disk=73.20048904418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.248 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:31 np0005466013 nova_compute[192144]: 2025-10-02 12:30:31.248 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.565 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 205fe71f-8f8c-4ae1-ac04-9344041cfd6c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.567 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.567 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.625 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.697 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.698 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.711 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.739 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.780 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.780 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.790 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.875 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:32 np0005466013 nova_compute[192144]: 2025-10-02 12:30:32.910 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:30:33 np0005466013 nova_compute[192144]: 2025-10-02 12:30:33.423 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:30:33 np0005466013 nova_compute[192144]: 2025-10-02 12:30:33.424 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:33 np0005466013 nova_compute[192144]: 2025-10-02 12:30:33.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.080 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.081 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.088 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.089 2 INFO nova.compute.claims [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.404 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.405 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.517 2 DEBUG nova.compute.provider_tree [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.672 2 DEBUG nova.scheduler.client.report [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.745 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.747 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:30:34 np0005466013 nova_compute[192144]: 2025-10-02 12:30:34.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.057 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.057 2 DEBUG nova.network.neutron [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.141 2 INFO nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.170 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.410 2 DEBUG nova.policy [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.773 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.775 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.775 2 INFO nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Creating image(s)#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.776 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.776 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.777 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.789 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.854 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.855 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.856 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.867 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.926 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:35 np0005466013 nova_compute[192144]: 2025-10-02 12:30:35.928 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.079 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk 1073741824" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.082 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.083 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.177 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.179 2 DEBUG nova.virt.disk.api [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Checking if we can resize image /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.180 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.276 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.277 2 DEBUG nova.virt.disk.api [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Cannot resize image /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.278 2 DEBUG nova.objects.instance [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.340 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.340 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Ensure instance console log exists: /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.341 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.342 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:36 np0005466013 nova_compute[192144]: 2025-10-02 12:30:36.342 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:37 np0005466013 nova_compute[192144]: 2025-10-02 12:30:37.990 2 DEBUG nova.network.neutron [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Successfully created port: a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:30:38 np0005466013 nova_compute[192144]: 2025-10-02 12:30:38.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:38 np0005466013 podman[242749]: 2025-10-02 12:30:38.721963309 +0000 UTC m=+0.083991210 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:30:38 np0005466013 podman[242751]: 2025-10-02 12:30:38.748977026 +0000 UTC m=+0.113933774 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:30:38 np0005466013 podman[242750]: 2025-10-02 12:30:38.755904056 +0000 UTC m=+0.115745565 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 08:30:38 np0005466013 nova_compute[192144]: 2025-10-02 12:30:38.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:38 np0005466013 nova_compute[192144]: 2025-10-02 12:30:38.998 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:38 np0005466013 nova_compute[192144]: 2025-10-02 12:30:38.998 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:39 np0005466013 nova_compute[192144]: 2025-10-02 12:30:39.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:39 np0005466013 nova_compute[192144]: 2025-10-02 12:30:39.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:39 np0005466013 nova_compute[192144]: 2025-10-02 12:30:39.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:39 np0005466013 nova_compute[192144]: 2025-10-02 12:30:39.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.039 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.240 2 DEBUG nova.network.neutron [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Successfully updated port: a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.397 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.398 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.398 2 DEBUG nova.network.neutron [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.463 2 DEBUG nova.compute.manager [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-changed-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.463 2 DEBUG nova.compute.manager [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Refreshing instance network info cache due to event network-changed-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.464 2 DEBUG oslo_concurrency.lockutils [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.680 2 DEBUG nova.network.neutron [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:30:40 np0005466013 nova_compute[192144]: 2025-10-02 12:30:40.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.593 2 DEBUG nova.network.neutron [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updating instance_info_cache with network_info: [{"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.624 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.625 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance network_info: |[{"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.625 2 DEBUG oslo_concurrency.lockutils [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.626 2 DEBUG nova.network.neutron [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Refreshing network info cache for port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.631 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Start _get_guest_xml network_info=[{"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.639 2 WARNING nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.644 2 DEBUG nova.virt.libvirt.host [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.645 2 DEBUG nova.virt.libvirt.host [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.648 2 DEBUG nova.virt.libvirt.host [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.649 2 DEBUG nova.virt.libvirt.host [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.651 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.651 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.652 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.652 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.653 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.653 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.654 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.654 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.655 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.655 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.656 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.656 2 DEBUG nova.virt.hardware [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.662 2 DEBUG nova.virt.libvirt.vif [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-853898395',display_name='tempest-TestNetworkAdvancedServerOps-server-853898395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-853898395',id=135,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJcWJR9H07bCSYUqeF8jPAGHvorXmh9wYiFyjeR3r1c8wskLP0QtO4hgxQKDidv5VyuYzyK3XTaMOh059pbUMGGg0yimxkoI04eFolQEnRD7tnn/yCbWfabjbEavYmykBA==',key_name='tempest-TestNetworkAdvancedServerOps-1317244517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-3e4dgpei',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:35Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=205fe71f-8f8c-4ae1-ac04-9344041cfd6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.663 2 DEBUG nova.network.os_vif_util [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.664 2 DEBUG nova.network.os_vif_util [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.666 2 DEBUG nova.objects.instance [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.688 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <uuid>205fe71f-8f8c-4ae1-ac04-9344041cfd6c</uuid>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <name>instance-00000087</name>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-853898395</nova:name>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:30:41</nova:creationTime>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        <nova:port uuid="a9e0d1d4-1101-49b7-adda-6a2a6db11fe1">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <entry name="serial">205fe71f-8f8c-4ae1-ac04-9344041cfd6c</entry>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <entry name="uuid">205fe71f-8f8c-4ae1-ac04-9344041cfd6c</entry>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:b4:da:2c"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <target dev="tapa9e0d1d4-11"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/console.log" append="off"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:30:41 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:30:41 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:30:41 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:30:41 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.690 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Preparing to wait for external event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.690 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.691 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.691 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.692 2 DEBUG nova.virt.libvirt.vif [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-853898395',display_name='tempest-TestNetworkAdvancedServerOps-server-853898395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-853898395',id=135,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJcWJR9H07bCSYUqeF8jPAGHvorXmh9wYiFyjeR3r1c8wskLP0QtO4hgxQKDidv5VyuYzyK3XTaMOh059pbUMGGg0yimxkoI04eFolQEnRD7tnn/yCbWfabjbEavYmykBA==',key_name='tempest-TestNetworkAdvancedServerOps-1317244517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-3e4dgpei',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:35Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=205fe71f-8f8c-4ae1-ac04-9344041cfd6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.692 2 DEBUG nova.network.os_vif_util [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.693 2 DEBUG nova.network.os_vif_util [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.693 2 DEBUG os_vif [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.698 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9e0d1d4-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.699 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa9e0d1d4-11, col_values=(('external_ids', {'iface-id': 'a9e0d1d4-1101-49b7-adda-6a2a6db11fe1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:da:2c', 'vm-uuid': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:41 np0005466013 NetworkManager[51205]: <info>  [1759408241.7022] manager: (tapa9e0d1d4-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.711 2 INFO os_vif [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11')#033[00m
Oct  2 08:30:41 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:41Z|00586|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.777 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.777 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.777 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No VIF found with MAC fa:16:3e:b4:da:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:41 np0005466013 nova_compute[192144]: 2025-10-02 12:30:41.778 2 INFO nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Using config drive#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.299 2 INFO nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Creating config drive at /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.309 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6kv3xstw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.453 2 DEBUG oslo_concurrency.processutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6kv3xstw" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:42 np0005466013 kernel: tapa9e0d1d4-11: entered promiscuous mode
Oct  2 08:30:42 np0005466013 NetworkManager[51205]: <info>  [1759408242.5389] manager: (tapa9e0d1d4-11): new Tun device (/org/freedesktop/NetworkManager/Devices/258)
Oct  2 08:30:42 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:42Z|00587|binding|INFO|Claiming lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for this chassis.
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:42 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:42Z|00588|binding|INFO|a9e0d1d4-1101-49b7-adda-6a2a6db11fe1: Claiming fa:16:3e:b4:da:2c 10.100.0.14
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:42 np0005466013 systemd-machined[152202]: New machine qemu-66-instance-00000087.
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.601 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:da:2c 10.100.0.14'], port_security=['fa:16:3e:b4:da:2c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efed3bdf-e287-4892-a4a2-6d198fc94413', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7f00fc5d-beb8-472a-8d55-c082ab0c14cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec777f9d-784c-4505-9548-eae114383c79, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.602 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 in datapath efed3bdf-e287-4892-a4a2-6d198fc94413 bound to our chassis#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.604 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network efed3bdf-e287-4892-a4a2-6d198fc94413#033[00m
Oct  2 08:30:42 np0005466013 systemd[1]: Started Virtual Machine qemu-66-instance-00000087.
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.624 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[59abef97-774f-4fd0-9d94-b84fd92351d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.625 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapefed3bdf-e1 in ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.627 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapefed3bdf-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.627 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[91461ea1-8662-4b41-8718-b09c6c9b4aee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.629 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1d209d87-21e9-409d-b0d1-45c53a436208]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:42Z|00589|binding|INFO|Setting lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 ovn-installed in OVS
Oct  2 08:30:42 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:42Z|00590|binding|INFO|Setting lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 up in Southbound
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:42 np0005466013 systemd-udevd[242837]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.647 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1d71e5-8bcc-45d1-a054-5d29a6377a38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 NetworkManager[51205]: <info>  [1759408242.6622] device (tapa9e0d1d4-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:30:42 np0005466013 NetworkManager[51205]: <info>  [1759408242.6636] device (tapa9e0d1d4-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.682 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[31b29e46-f4d5-4a5e-8124-b69503c65cf5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.729 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[03625c71-c2a3-4ee7-b4e3-961a74208502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.734 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[49337b33-c1c0-45f9-a30d-c28b283bb0ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 NetworkManager[51205]: <info>  [1759408242.7358] manager: (tapefed3bdf-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/259)
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.788 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[36a85a16-ac24-450f-8de8-ac8574a72b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.792 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d59e0408-2073-4bc8-9ec0-af37e1b9401b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 NetworkManager[51205]: <info>  [1759408242.8239] device (tapefed3bdf-e0): carrier: link connected
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.831 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f21a9d15-35a4-4c15-a38a-6e0536314f66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.861 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1b760408-423d-45cc-b8ef-83ce34d53265]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefed3bdf-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:32:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623444, 'reachable_time': 43956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242868, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.887 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1fcdf78f-7e93-4c9d-8f12-02a630c973ab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea5:32cb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623444, 'tstamp': 623444}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242869, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.912 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[505127d2-e91a-4bab-8e31-dcb1518bcb7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefed3bdf-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:32:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623444, 'reachable_time': 43956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242870, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:42.955 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc23fc9-1480-453c-b97a-965547a9f8ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.974 2 DEBUG nova.compute.manager [req-965916e1-d048-4da3-986c-5f7612fd703f req-387783bb-7df9-4bb0-a116-0cc77565daa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.974 2 DEBUG oslo_concurrency.lockutils [req-965916e1-d048-4da3-986c-5f7612fd703f req-387783bb-7df9-4bb0-a116-0cc77565daa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.975 2 DEBUG oslo_concurrency.lockutils [req-965916e1-d048-4da3-986c-5f7612fd703f req-387783bb-7df9-4bb0-a116-0cc77565daa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.975 2 DEBUG oslo_concurrency.lockutils [req-965916e1-d048-4da3-986c-5f7612fd703f req-387783bb-7df9-4bb0-a116-0cc77565daa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.976 2 DEBUG nova.compute.manager [req-965916e1-d048-4da3-986c-5f7612fd703f req-387783bb-7df9-4bb0-a116-0cc77565daa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Processing event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:30:42 np0005466013 nova_compute[192144]: 2025-10-02 12:30:42.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.029 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[86a8e01d-cde7-4e53-a627-52577f1a769f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.030 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefed3bdf-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.031 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.031 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefed3bdf-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:43 np0005466013 kernel: tapefed3bdf-e0: entered promiscuous mode
Oct  2 08:30:43 np0005466013 NetworkManager[51205]: <info>  [1759408243.0373] manager: (tapefed3bdf-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.038 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapefed3bdf-e0, col_values=(('external_ids', {'iface-id': '96781755-97bc-4b0c-9d56-1511bbfc3ac7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:43Z|00591|binding|INFO|Releasing lport 96781755-97bc-4b0c-9d56-1511bbfc3ac7 from this chassis (sb_readonly=0)
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.043 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/efed3bdf-e287-4892-a4a2-6d198fc94413.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/efed3bdf-e287-4892-a4a2-6d198fc94413.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.044 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aa03f5a4-bb07-4c7b-a54e-696db395ffd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.045 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-efed3bdf-e287-4892-a4a2-6d198fc94413
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/efed3bdf-e287-4892-a4a2-6d198fc94413.pid.haproxy
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID efed3bdf-e287-4892-a4a2-6d198fc94413
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:30:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:30:43.046 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'env', 'PROCESS_TAG=haproxy-efed3bdf-e287-4892-a4a2-6d198fc94413', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/efed3bdf-e287-4892-a4a2-6d198fc94413.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.094 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:43 np0005466013 podman[242909]: 2025-10-02 12:30:43.46594207 +0000 UTC m=+0.062206015 container create f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:30:43 np0005466013 systemd[1]: Started libpod-conmon-f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b.scope.
Oct  2 08:30:43 np0005466013 podman[242909]: 2025-10-02 12:30:43.428110725 +0000 UTC m=+0.024374750 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:30:43 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:30:43 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/549d8efa2e23f5bac53fff61feb2d6583d763512c14200e32646536e2ac5f0be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:30:43 np0005466013 podman[242909]: 2025-10-02 12:30:43.576254683 +0000 UTC m=+0.172518718 container init f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:30:43 np0005466013 podman[242909]: 2025-10-02 12:30:43.583929148 +0000 UTC m=+0.180193103 container start f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.612 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408243.6120765, 205fe71f-8f8c-4ae1-ac04-9344041cfd6c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.613 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.615 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.620 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:30:43 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[242924]: [NOTICE]   (242928) : New worker (242930) forked
Oct  2 08:30:43 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[242924]: [NOTICE]   (242928) : Loading success.
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.625 2 INFO nova.virt.libvirt.driver [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance spawned successfully.#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.625 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.657 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.665 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.669 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.669 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.670 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.670 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.671 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.671 2 DEBUG nova.virt.libvirt.driver [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.784 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.784 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408243.61296, 205fe71f-8f8c-4ae1-ac04-9344041cfd6c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.785 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.958 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.962 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408243.6183112, 205fe71f-8f8c-4ae1-ac04-9344041cfd6c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.963 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.994 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:43 np0005466013 nova_compute[192144]: 2025-10-02 12:30:43.999 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.010 2 INFO nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Took 8.24 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.010 2 DEBUG nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.023 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.118 2 INFO nova.compute.manager [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Took 10.53 seconds to build instance.#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.139 2 DEBUG oslo_concurrency.lockutils [None req-6a04895e-7f8c-432b-b1fd-7e790b7da98a 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.614 2 DEBUG nova.network.neutron [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updated VIF entry in instance network info cache for port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.615 2 DEBUG nova.network.neutron [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updating instance_info_cache with network_info: [{"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.692 2 DEBUG oslo_concurrency.lockutils [req-ee710bf7-e637-40ee-9429-47089a452be7 req-7910e077-18d5-4841-ae1e-30ba70101a92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:44 np0005466013 nova_compute[192144]: 2025-10-02 12:30:44.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:45 np0005466013 nova_compute[192144]: 2025-10-02 12:30:45.088 2 DEBUG nova.compute.manager [req-3380559d-130a-4bbc-b8ae-24e138e23f44 req-f3e30c5a-a7da-4681-bf00-c7685737eca3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:45 np0005466013 nova_compute[192144]: 2025-10-02 12:30:45.089 2 DEBUG oslo_concurrency.lockutils [req-3380559d-130a-4bbc-b8ae-24e138e23f44 req-f3e30c5a-a7da-4681-bf00-c7685737eca3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:45 np0005466013 nova_compute[192144]: 2025-10-02 12:30:45.090 2 DEBUG oslo_concurrency.lockutils [req-3380559d-130a-4bbc-b8ae-24e138e23f44 req-f3e30c5a-a7da-4681-bf00-c7685737eca3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:45 np0005466013 nova_compute[192144]: 2025-10-02 12:30:45.090 2 DEBUG oslo_concurrency.lockutils [req-3380559d-130a-4bbc-b8ae-24e138e23f44 req-f3e30c5a-a7da-4681-bf00-c7685737eca3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:45 np0005466013 nova_compute[192144]: 2025-10-02 12:30:45.090 2 DEBUG nova.compute.manager [req-3380559d-130a-4bbc-b8ae-24e138e23f44 req-f3e30c5a-a7da-4681-bf00-c7685737eca3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] No waiting events found dispatching network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:45 np0005466013 nova_compute[192144]: 2025-10-02 12:30:45.091 2 WARNING nova.compute.manager [req-3380559d-130a-4bbc-b8ae-24e138e23f44 req-f3e30c5a-a7da-4681-bf00-c7685737eca3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received unexpected event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:30:46 np0005466013 nova_compute[192144]: 2025-10-02 12:30:46.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:48 np0005466013 podman[242939]: 2025-10-02 12:30:48.708341362 +0000 UTC m=+0.076718159 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd)
Oct  2 08:30:48 np0005466013 podman[242941]: 2025-10-02 12:30:48.714474155 +0000 UTC m=+0.077669260 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:30:48 np0005466013 podman[242940]: 2025-10-02 12:30:48.732156212 +0000 UTC m=+0.097755446 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Oct  2 08:30:48 np0005466013 NetworkManager[51205]: <info>  [1759408248.9324] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Oct  2 08:30:48 np0005466013 NetworkManager[51205]: <info>  [1759408248.9335] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Oct  2 08:30:48 np0005466013 nova_compute[192144]: 2025-10-02 12:30:48.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:49 np0005466013 nova_compute[192144]: 2025-10-02 12:30:49.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:49Z|00592|binding|INFO|Releasing lport 96781755-97bc-4b0c-9d56-1511bbfc3ac7 from this chassis (sb_readonly=0)
Oct  2 08:30:49 np0005466013 nova_compute[192144]: 2025-10-02 12:30:49.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:49 np0005466013 nova_compute[192144]: 2025-10-02 12:30:49.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005466013 nova_compute[192144]: 2025-10-02 12:30:50.973 2 DEBUG nova.compute.manager [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-changed-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:50 np0005466013 nova_compute[192144]: 2025-10-02 12:30:50.975 2 DEBUG nova.compute.manager [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Refreshing instance network info cache due to event network-changed-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:50 np0005466013 nova_compute[192144]: 2025-10-02 12:30:50.975 2 DEBUG oslo_concurrency.lockutils [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:50 np0005466013 nova_compute[192144]: 2025-10-02 12:30:50.976 2 DEBUG oslo_concurrency.lockutils [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:50 np0005466013 nova_compute[192144]: 2025-10-02 12:30:50.976 2 DEBUG nova.network.neutron [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Refreshing network info cache for port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:51 np0005466013 nova_compute[192144]: 2025-10-02 12:30:51.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:53 np0005466013 podman[243003]: 2025-10-02 12:30:53.702274912 +0000 UTC m=+0.070959307 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:30:53 np0005466013 podman[243002]: 2025-10-02 12:30:53.729116673 +0000 UTC m=+0.099189684 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:30:53 np0005466013 nova_compute[192144]: 2025-10-02 12:30:53.872 2 DEBUG nova.network.neutron [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updated VIF entry in instance network info cache for port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:53 np0005466013 nova_compute[192144]: 2025-10-02 12:30:53.872 2 DEBUG nova.network.neutron [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updating instance_info_cache with network_info: [{"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:53 np0005466013 nova_compute[192144]: 2025-10-02 12:30:53.894 2 DEBUG oslo_concurrency.lockutils [req-aaaf8013-888e-4d01-ac43-5fbcf0c3ad35 req-24d7b84a-c012-4a20-9d7d-b93fdd5cb689 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:54 np0005466013 nova_compute[192144]: 2025-10-02 12:30:54.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:55 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:55Z|00593|binding|INFO|Releasing lport 96781755-97bc-4b0c-9d56-1511bbfc3ac7 from this chassis (sb_readonly=0)
Oct  2 08:30:55 np0005466013 nova_compute[192144]: 2025-10-02 12:30:55.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:55 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:55Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:da:2c 10.100.0.14
Oct  2 08:30:55 np0005466013 ovn_controller[94366]: 2025-10-02T12:30:55Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:da:2c 10.100.0.14
Oct  2 08:30:56 np0005466013 nova_compute[192144]: 2025-10-02 12:30:56.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:59 np0005466013 nova_compute[192144]: 2025-10-02 12:30:59.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:01 np0005466013 nova_compute[192144]: 2025-10-02 12:31:01.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:02.314 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:02.315 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:02.316 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:03 np0005466013 nova_compute[192144]: 2025-10-02 12:31:03.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:03 np0005466013 nova_compute[192144]: 2025-10-02 12:31:03.751 2 INFO nova.compute.manager [None req-6cd223fd-e8d8-47a6-b006-8e1c72ad8204 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Get console output#033[00m
Oct  2 08:31:03 np0005466013 nova_compute[192144]: 2025-10-02 12:31:03.761 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:31:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:04.077 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:04 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:04.079 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:31:04 np0005466013 nova_compute[192144]: 2025-10-02 12:31:04.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:04 np0005466013 nova_compute[192144]: 2025-10-02 12:31:04.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.294 2 INFO nova.compute.manager [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Rebuilding instance#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.724 2 DEBUG nova.compute.manager [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.788 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_requests' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.803 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.814 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'resources' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.825 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.836 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:31:05 np0005466013 nova_compute[192144]: 2025-10-02 12:31:05.839 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:31:06 np0005466013 nova_compute[192144]: 2025-10-02 12:31:06.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:07 np0005466013 nova_compute[192144]: 2025-10-02 12:31:07.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:07 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:07.082 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:08 np0005466013 kernel: tapa9e0d1d4-11 (unregistering): left promiscuous mode
Oct  2 08:31:08 np0005466013 NetworkManager[51205]: <info>  [1759408268.7835] device (tapa9e0d1d4-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:31:08 np0005466013 nova_compute[192144]: 2025-10-02 12:31:08.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:08Z|00594|binding|INFO|Releasing lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 from this chassis (sb_readonly=0)
Oct  2 08:31:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:08Z|00595|binding|INFO|Setting lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 down in Southbound
Oct  2 08:31:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:08Z|00596|binding|INFO|Removing iface tapa9e0d1d4-11 ovn-installed in OVS
Oct  2 08:31:08 np0005466013 nova_compute[192144]: 2025-10-02 12:31:08.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:08.806 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:da:2c 10.100.0.14'], port_security=['fa:16:3e:b4:da:2c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efed3bdf-e287-4892-a4a2-6d198fc94413', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7f00fc5d-beb8-472a-8d55-c082ab0c14cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec777f9d-784c-4505-9548-eae114383c79, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:08.809 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 in datapath efed3bdf-e287-4892-a4a2-6d198fc94413 unbound from our chassis#033[00m
Oct  2 08:31:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:08.813 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network efed3bdf-e287-4892-a4a2-6d198fc94413, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:31:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:08.817 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[66b7a537-66b4-4dd0-a811-a4c43a3cff2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:08.818 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 namespace which is not needed anymore#033[00m
Oct  2 08:31:08 np0005466013 nova_compute[192144]: 2025-10-02 12:31:08.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:08 np0005466013 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000087.scope: Deactivated successfully.
Oct  2 08:31:08 np0005466013 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000087.scope: Consumed 14.159s CPU time.
Oct  2 08:31:08 np0005466013 systemd-machined[152202]: Machine qemu-66-instance-00000087 terminated.
Oct  2 08:31:08 np0005466013 podman[243061]: 2025-10-02 12:31:08.879607927 +0000 UTC m=+0.066241491 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:31:08 np0005466013 podman[243065]: 2025-10-02 12:31:08.900378926 +0000 UTC m=+0.073841642 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:31:08 np0005466013 podman[243066]: 2025-10-02 12:31:08.916072017 +0000 UTC m=+0.087859258 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.027 2 DEBUG nova.compute.manager [req-b7dc1781-ec4f-44ff-8e93-6c406c384ab6 req-7190171b-cc6e-4be0-bb34-82a6611a142e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-unplugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.027 2 DEBUG oslo_concurrency.lockutils [req-b7dc1781-ec4f-44ff-8e93-6c406c384ab6 req-7190171b-cc6e-4be0-bb34-82a6611a142e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.028 2 DEBUG oslo_concurrency.lockutils [req-b7dc1781-ec4f-44ff-8e93-6c406c384ab6 req-7190171b-cc6e-4be0-bb34-82a6611a142e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.028 2 DEBUG oslo_concurrency.lockutils [req-b7dc1781-ec4f-44ff-8e93-6c406c384ab6 req-7190171b-cc6e-4be0-bb34-82a6611a142e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.028 2 DEBUG nova.compute.manager [req-b7dc1781-ec4f-44ff-8e93-6c406c384ab6 req-7190171b-cc6e-4be0-bb34-82a6611a142e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] No waiting events found dispatching network-vif-unplugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.029 2 WARNING nova.compute.manager [req-b7dc1781-ec4f-44ff-8e93-6c406c384ab6 req-7190171b-cc6e-4be0-bb34-82a6611a142e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received unexpected event network-vif-unplugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.101 2 INFO nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.107 2 INFO nova.virt.libvirt.driver [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance destroyed successfully.#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.112 2 INFO nova.virt.libvirt.driver [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance destroyed successfully.#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.112 2 DEBUG nova.virt.libvirt.vif [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:30:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-853898395',display_name='tempest-TestNetworkAdvancedServerOps-server-853898395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-853898395',id=135,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJcWJR9H07bCSYUqeF8jPAGHvorXmh9wYiFyjeR3r1c8wskLP0QtO4hgxQKDidv5VyuYzyK3XTaMOh059pbUMGGg0yimxkoI04eFolQEnRD7tnn/yCbWfabjbEavYmykBA==',key_name='tempest-TestNetworkAdvancedServerOps-1317244517',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:30:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-3e4dgpei',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:04Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=205fe71f-8f8c-4ae1-ac04-9344041cfd6c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 
4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.113 2 DEBUG nova.network.os_vif_util [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.114 2 DEBUG nova.network.os_vif_util [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.114 2 DEBUG os_vif [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.119 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9e0d1d4-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.125 2 INFO os_vif [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11')#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.125 2 INFO nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Deleting instance files /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c_del#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.126 2 INFO nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Deletion of /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c_del complete#033[00m
Oct  2 08:31:09 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[242924]: [NOTICE]   (242928) : haproxy version is 2.8.14-c23fe91
Oct  2 08:31:09 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[242924]: [NOTICE]   (242928) : path to executable is /usr/sbin/haproxy
Oct  2 08:31:09 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[242924]: [WARNING]  (242928) : Exiting Master process...
Oct  2 08:31:09 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[242924]: [ALERT]    (242928) : Current worker (242930) exited with code 143 (Terminated)
Oct  2 08:31:09 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[242924]: [WARNING]  (242928) : All workers exited. Exiting... (0)
Oct  2 08:31:09 np0005466013 systemd[1]: libpod-f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b.scope: Deactivated successfully.
Oct  2 08:31:09 np0005466013 podman[243148]: 2025-10-02 12:31:09.152729275 +0000 UTC m=+0.220341857 container died f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.564 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.565 2 INFO nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Creating image(s)#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.566 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.566 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.567 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.584 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.683 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.686 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.687 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.698 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.765 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.766 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:09 np0005466013 nova_compute[192144]: 2025-10-02 12:31:09.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:31:10 np0005466013 systemd[1]: var-lib-containers-storage-overlay-549d8efa2e23f5bac53fff61feb2d6583d763512c14200e32646536e2ac5f0be-merged.mount: Deactivated successfully.
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.618 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk 1073741824" returned: 0 in 0.852s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.619 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.620 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.711 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.713 2 DEBUG nova.virt.disk.api [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Checking if we can resize image /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.714 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.809 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.811 2 DEBUG nova.virt.disk.api [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Cannot resize image /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.813 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.814 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Ensure instance console log exists: /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.815 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.816 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.816 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.819 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Start _get_guest_xml network_info=[{"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.826 2 WARNING nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.835 2 DEBUG nova.virt.libvirt.host [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.836 2 DEBUG nova.virt.libvirt.host [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.841 2 DEBUG nova.virt.libvirt.host [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.841 2 DEBUG nova.virt.libvirt.host [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.843 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.843 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.844 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.844 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.844 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.844 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.845 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.845 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.845 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.846 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.846 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.846 2 DEBUG nova.virt.hardware [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.847 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.886 2 DEBUG nova.virt.libvirt.vif [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:30:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-853898395',display_name='tempest-TestNetworkAdvancedServerOps-server-853898395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-853898395',id=135,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJcWJR9H07bCSYUqeF8jPAGHvorXmh9wYiFyjeR3r1c8wskLP0QtO4hgxQKDidv5VyuYzyK3XTaMOh059pbUMGGg0yimxkoI04eFolQEnRD7tnn/yCbWfabjbEavYmykBA==',key_name='tempest-TestNetworkAdvancedServerOps-1317244517',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:30:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-3e4dgpei',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:09Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=205fe71f-8f8c-4ae1-ac04-9344041cfd6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.887 2 DEBUG nova.network.os_vif_util [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.888 2 DEBUG nova.network.os_vif_util [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.890 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <uuid>205fe71f-8f8c-4ae1-ac04-9344041cfd6c</uuid>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <name>instance-00000087</name>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-853898395</nova:name>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:31:10</nova:creationTime>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="062d9f80-76b6-42ce-bee7-0fb82a008353"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        <nova:port uuid="a9e0d1d4-1101-49b7-adda-6a2a6db11fe1">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <entry name="serial">205fe71f-8f8c-4ae1-ac04-9344041cfd6c</entry>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <entry name="uuid">205fe71f-8f8c-4ae1-ac04-9344041cfd6c</entry>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:b4:da:2c"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <target dev="tapa9e0d1d4-11"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/console.log" append="off"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:31:10 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:31:10 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:31:10 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:31:10 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.892 2 DEBUG nova.virt.libvirt.vif [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:30:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-853898395',display_name='tempest-TestNetworkAdvancedServerOps-server-853898395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-853898395',id=135,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJcWJR9H07bCSYUqeF8jPAGHvorXmh9wYiFyjeR3r1c8wskLP0QtO4hgxQKDidv5VyuYzyK3XTaMOh059pbUMGGg0yimxkoI04eFolQEnRD7tnn/yCbWfabjbEavYmykBA==',key_name='tempest-TestNetworkAdvancedServerOps-1317244517',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:30:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-3e4dgpei',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:09Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=205fe71f-8f8c-4ae1-ac04-9344041cfd6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.893 2 DEBUG nova.network.os_vif_util [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.894 2 DEBUG nova.network.os_vif_util [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.895 2 DEBUG os_vif [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.896 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.897 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.903 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9e0d1d4-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.905 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa9e0d1d4-11, col_values=(('external_ids', {'iface-id': 'a9e0d1d4-1101-49b7-adda-6a2a6db11fe1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:da:2c', 'vm-uuid': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:10 np0005466013 NetworkManager[51205]: <info>  [1759408270.9097] manager: (tapa9e0d1d4-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:10 np0005466013 nova_compute[192144]: 2025-10-02 12:31:10.918 2 INFO os_vif [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11')#033[00m
Oct  2 08:31:11 np0005466013 podman[243148]: 2025-10-02 12:31:11.031676151 +0000 UTC m=+2.099288773 container cleanup f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:31:11 np0005466013 systemd[1]: libpod-conmon-f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b.scope: Deactivated successfully.
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.110 2 DEBUG nova.compute.manager [req-57137942-2392-4195-9ade-426357d2b4c6 req-1ae7be6e-b5ab-49f7-b78a-1f52af6f1ca2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.111 2 DEBUG oslo_concurrency.lockutils [req-57137942-2392-4195-9ade-426357d2b4c6 req-1ae7be6e-b5ab-49f7-b78a-1f52af6f1ca2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.111 2 DEBUG oslo_concurrency.lockutils [req-57137942-2392-4195-9ade-426357d2b4c6 req-1ae7be6e-b5ab-49f7-b78a-1f52af6f1ca2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.112 2 DEBUG oslo_concurrency.lockutils [req-57137942-2392-4195-9ade-426357d2b4c6 req-1ae7be6e-b5ab-49f7-b78a-1f52af6f1ca2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.112 2 DEBUG nova.compute.manager [req-57137942-2392-4195-9ade-426357d2b4c6 req-1ae7be6e-b5ab-49f7-b78a-1f52af6f1ca2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] No waiting events found dispatching network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.112 2 WARNING nova.compute.manager [req-57137942-2392-4195-9ade-426357d2b4c6 req-1ae7be6e-b5ab-49f7-b78a-1f52af6f1ca2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received unexpected event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.118 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.118 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.119 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No VIF found with MAC fa:16:3e:b4:da:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.119 2 INFO nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Using config drive#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.141 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.174 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'keypairs' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.319 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.320 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.362 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.441 2 INFO nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Creating config drive at /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.451 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbf3fvl4v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.597 2 DEBUG oslo_concurrency.processutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbf3fvl4v" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:11 np0005466013 kernel: tapa9e0d1d4-11: entered promiscuous mode
Oct  2 08:31:11 np0005466013 systemd-udevd[243102]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:11Z|00597|binding|INFO|Claiming lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for this chassis.
Oct  2 08:31:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:11Z|00598|binding|INFO|a9e0d1d4-1101-49b7-adda-6a2a6db11fe1: Claiming fa:16:3e:b4:da:2c 10.100.0.14
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:11 np0005466013 NetworkManager[51205]: <info>  [1759408271.6981] manager: (tapa9e0d1d4-11): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:11 np0005466013 NetworkManager[51205]: <info>  [1759408271.7126] device (tapa9e0d1d4-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:31:11 np0005466013 NetworkManager[51205]: <info>  [1759408271.7172] device (tapa9e0d1d4-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.718 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:da:2c 10.100.0.14'], port_security=['fa:16:3e:b4:da:2c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efed3bdf-e287-4892-a4a2-6d198fc94413', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7f00fc5d-beb8-472a-8d55-c082ab0c14cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec777f9d-784c-4505-9548-eae114383c79, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:11Z|00599|binding|INFO|Setting lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 ovn-installed in OVS
Oct  2 08:31:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:11Z|00600|binding|INFO|Setting lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 up in Southbound
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.744 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.745 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:11 np0005466013 systemd-machined[152202]: New machine qemu-67-instance-00000087.
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.754 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.755 2 INFO nova.compute.claims [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:31:11 np0005466013 systemd[1]: Started Virtual Machine qemu-67-instance-00000087.
Oct  2 08:31:11 np0005466013 podman[243213]: 2025-10-02 12:31:11.787585958 +0000 UTC m=+0.716347766 container remove f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.798 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f3e00a-eec2-4309-b169-f181cd6c751f]: (4, ('Thu Oct  2 12:31:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 (f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b)\nf817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b\nThu Oct  2 12:31:11 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 (f817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b)\nf817f4ab7ac1c1c849d73be959e820cacc729f223e73f5a81e6f0d17728d714b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.801 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e75b3218-fb03-43ec-bb63-6b2eeceea25d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.802 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefed3bdf-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:11 np0005466013 kernel: tapefed3bdf-e0: left promiscuous mode
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.832 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[23b22261-b250-47e4-ae80-36051e55d488]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.859 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4564c096-e82f-45b9-8747-d5692e5af15b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.861 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[140a8a0d-254e-4185-ab7f-435175e4d0aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.888 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[69cf2bfc-8602-43bf-b7b2-7bb57376bbed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623433, 'reachable_time': 44900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243250, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 systemd[1]: run-netns-ovnmeta\x2defed3bdf\x2de287\x2d4892\x2da4a2\x2d6d198fc94413.mount: Deactivated successfully.
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.894 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.894 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d1cbd9-0302-4dee-9bb1-e8b021cf81d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.898 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 in datapath efed3bdf-e287-4892-a4a2-6d198fc94413 unbound from our chassis#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.901 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network efed3bdf-e287-4892-a4a2-6d198fc94413#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.910 2 DEBUG nova.compute.provider_tree [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.921 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[defd1f2b-0cdd-40db-b135-af94aaa07a60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.922 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapefed3bdf-e1 in ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.928 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapefed3bdf-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.928 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[46abf4b7-1444-4fc1-ae25-944b83551ef5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.929 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[748f4c99-da10-4d6e-a7b8-58e1945c818f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.929 2 DEBUG nova.scheduler.client.report [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.949 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4334c5-beb6-4989-9e28-3fa2c37e44f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.963 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:11 np0005466013 nova_compute[192144]: 2025-10-02 12:31:11.965 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:31:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:11.976 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe1a87b-6b89-4555-8190-928fb6016006]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.023 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4d6c2d-0e0d-4532-a429-b0dd96c5dd57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 systemd-udevd[243252]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.032 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[06b00c96-bf74-43f4-9f54-6b6f846241e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 NetworkManager[51205]: <info>  [1759408272.0341] manager: (tapefed3bdf-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/265)
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.038 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.039 2 DEBUG nova.network.neutron [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.067 2 INFO nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.085 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[89b98317-26e6-49c4-baae-b19a9cd420bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.090 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[cc10dd07-fe4d-4b75-9d19-199e08d00680]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.095 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:31:12 np0005466013 NetworkManager[51205]: <info>  [1759408272.1253] device (tapefed3bdf-e0): carrier: link connected
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.136 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc2375e-f488-48aa-aad8-a3edebe26ebd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.160 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[04f2c47c-7cec-4ada-b8db-d28d59e1dbb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefed3bdf-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:32:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626374, 'reachable_time': 31742, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243286, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.182 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[71dbc2f6-1323-42dc-9786-766fe4caacbd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea5:32cb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 626374, 'tstamp': 626374}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243287, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.212 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0821228c-2527-4985-b4aa-095471911570]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefed3bdf-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:32:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626374, 'reachable_time': 31742, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243288, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.257 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ca775ade-4ac4-4ec7-a4ee-226c2d804cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.283 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.286 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.287 2 INFO nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Creating image(s)#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.288 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "/var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.289 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "/var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.290 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "/var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.307 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.342 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63c118a0-dce2-49e0-950c-776aa219896e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.344 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefed3bdf-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.344 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.345 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefed3bdf-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:12 np0005466013 NetworkManager[51205]: <info>  [1759408272.3479] manager: (tapefed3bdf-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:12 np0005466013 kernel: tapefed3bdf-e0: entered promiscuous mode
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.354 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapefed3bdf-e0, col_values=(('external_ids', {'iface-id': '96781755-97bc-4b0c-9d56-1511bbfc3ac7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:12Z|00601|binding|INFO|Releasing lport 96781755-97bc-4b0c-9d56-1511bbfc3ac7 from this chassis (sb_readonly=0)
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.358 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/efed3bdf-e287-4892-a4a2-6d198fc94413.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/efed3bdf-e287-4892-a4a2-6d198fc94413.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.359 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[160537d3-c9cf-495d-bb3e-23a131ba0768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.360 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-efed3bdf-e287-4892-a4a2-6d198fc94413
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/efed3bdf-e287-4892-a4a2-6d198fc94413.pid.haproxy
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID efed3bdf-e287-4892-a4a2-6d198fc94413
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:31:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:12.361 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'env', 'PROCESS_TAG=haproxy-efed3bdf-e287-4892-a4a2-6d198fc94413', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/efed3bdf-e287-4892-a4a2-6d198fc94413.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.380 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.381 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.382 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.399 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.449 2 DEBUG nova.policy [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee501201a91d4d1facccde8769261729', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b6f9110c2f2b4d5ba26abe5fc13b1395', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.460 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.462 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.554 2 DEBUG nova.virt.libvirt.host [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Removed pending event for 205fe71f-8f8c-4ae1-ac04-9344041cfd6c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.555 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408272.5536764, 205fe71f-8f8c-4ae1-ac04-9344041cfd6c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.556 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.559 2 DEBUG nova.compute.manager [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.559 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.563 2 INFO nova.virt.libvirt.driver [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance spawned successfully.#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.564 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.591 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.591 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.592 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.592 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.593 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.593 2 DEBUG nova.virt.libvirt.driver [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.596 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.600 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.626 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.626 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408272.5546892, 205fe71f-8f8c-4ae1-ac04-9344041cfd6c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.626 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.659 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.662 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.705 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.718 2 DEBUG nova.compute.manager [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.793 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk 1073741824" returned: 0 in 0.331s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.794 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.794 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.853 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.855 2 DEBUG nova.virt.disk.api [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Checking if we can resize image /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.856 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:12 np0005466013 podman[243329]: 2025-10-02 12:31:12.778944144 +0000 UTC m=+0.024158253 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.883 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.885 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.886 2 DEBUG nova.objects.instance [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.918 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.920 2 DEBUG nova.virt.disk.api [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Cannot resize image /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.920 2 DEBUG nova.objects.instance [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lazy-loading 'migration_context' on Instance uuid f9bd9b91-34ee-4ea0-9ff5-38db09b7f541 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.945 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.946 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Ensure instance console log exists: /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.947 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.947 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:12 np0005466013 nova_compute[192144]: 2025-10-02 12:31:12.948 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.002 2 DEBUG oslo_concurrency.lockutils [None req-40f2395a-8d62-41dc-bd96-d4560c0ab479 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.053 2 DEBUG nova.network.neutron [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Successfully created port: 15e9913c-6f57-4c6b-9187-b313d2cc7b28 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.202 2 DEBUG nova.compute.manager [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.203 2 DEBUG oslo_concurrency.lockutils [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.209 2 DEBUG oslo_concurrency.lockutils [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.209 2 DEBUG oslo_concurrency.lockutils [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.210 2 DEBUG nova.compute.manager [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] No waiting events found dispatching network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.210 2 WARNING nova.compute.manager [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received unexpected event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for instance with vm_state active and task_state None.
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.211 2 DEBUG nova.compute.manager [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.211 2 DEBUG oslo_concurrency.lockutils [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.212 2 DEBUG oslo_concurrency.lockutils [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.212 2 DEBUG oslo_concurrency.lockutils [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.213 2 DEBUG nova.compute.manager [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] No waiting events found dispatching network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:31:13 np0005466013 nova_compute[192144]: 2025-10-02 12:31:13.213 2 WARNING nova.compute.manager [req-0e48cb46-c53e-42dd-b02e-7190e21e306d req-7fac8dfa-45eb-4d59-980e-a79b974b161f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received unexpected event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for instance with vm_state active and task_state None.
Oct  2 08:31:13 np0005466013 podman[243329]: 2025-10-02 12:31:13.346332083 +0000 UTC m=+0.591546182 container create f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:31:13 np0005466013 systemd[1]: Started libpod-conmon-f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac.scope.
Oct  2 08:31:13 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:31:13 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d65ad3ffbf5b32d7653c93e7970ef3c7d38112a08be9831552f0a650f512e76e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:31:13 np0005466013 podman[243329]: 2025-10-02 12:31:13.771251941 +0000 UTC m=+1.016466120 container init f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:31:13 np0005466013 podman[243329]: 2025-10-02 12:31:13.781616615 +0000 UTC m=+1.026830734 container start f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:31:13 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[243350]: [NOTICE]   (243354) : New worker (243356) forked
Oct  2 08:31:13 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[243350]: [NOTICE]   (243354) : Loading success.
Oct  2 08:31:14 np0005466013 nova_compute[192144]: 2025-10-02 12:31:14.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.017 2 DEBUG nova.network.neutron [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Successfully updated port: 15e9913c-6f57-4c6b-9187-b313d2cc7b28 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.033 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "refresh_cache-f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.034 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquired lock "refresh_cache-f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.034 2 DEBUG nova.network.neutron [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.145 2 DEBUG nova.compute.manager [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received event network-changed-15e9913c-6f57-4c6b-9187-b313d2cc7b28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.146 2 DEBUG nova.compute.manager [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Refreshing instance network info cache due to event network-changed-15e9913c-6f57-4c6b-9187-b313d2cc7b28. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.147 2 DEBUG oslo_concurrency.lockutils [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.811 2 DEBUG nova.network.neutron [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:31:15 np0005466013 nova_compute[192144]: 2025-10-02 12:31:15.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.360 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000087', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'hostId': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.362 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.363 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>]
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.363 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.384 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/cpu volume: 3700000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93a072cf-67f8-41c8-905a-81c22351265c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3700000000, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'timestamp': '2025-10-02T12:31:16.363779', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b0cb16e6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.062501411, 'message_signature': '8e9e187b40a4ee47fd9ffdf26dee2dd52f8043ec9bf7ea7bb60610bc5cfd51f4'}]}, 'timestamp': '2025-10-02 12:31:16.385961', '_unique_id': '5a5085abb84941128b67bf0e8443ead5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.388 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.389 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.421 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.422 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd5e74e0-aeb1-4334-b498-32eea5717140', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.390021', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d09d78-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '57a70c3648a6209deb5a8b12e164c621db8c937b0ac2acd565d1e4fb03dca834'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.390021', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d0b68c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': 'cba0f1d7b46939998e41c904eedfb73d3c3b7984f296242546a3c8a55a14c133'}]}, 'timestamp': '2025-10-02 12:31:16.422653', '_unique_id': '1f9d1845e05c4e3aae2fab6bcccf9f63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.425 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.432 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 205fe71f-8f8c-4ae1-ac04-9344041cfd6c / tapa9e0d1d4-11 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.432 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69121cbc-a915-4b08-a16b-5d3095c3f9a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.425736', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0d25136-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': 'ac271c0a6ec66eb0be2d101b36bbcd194da70ef98f280dd743dd35477fa5f8ff'}]}, 'timestamp': '2025-10-02 12:31:16.433228', '_unique_id': '313e35f0d10f4b1c85004c5e4ebb28bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.436 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.436 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.437 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c29629a9-5315-4d0e-835e-1e6162adcef5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.436391', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d2e5ec-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '7c609c9a0cff0802c08d7a4229a58682cc12b65cabf950c0156ca1383d5a96b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.436391', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d2ff6e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '45d1d5e6055a295a6970ee2e419b673bbfc3ff75f3ad80c733a6f4e6ad5cdb8a'}]}, 'timestamp': '2025-10-02 12:31:16.437615', '_unique_id': 'b7ff9810f13f46f091fee0ecf6ff7e5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.438 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.440 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.440 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.440 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>]
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.441 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.457 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.457 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca9c2985-0c77-46d3-a1ae-53a7ece8beb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.441477', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d61384-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.119757512, 'message_signature': '4849fcd47cf83f722644ba64d0b500876a231c95e2e19ca1135019a96f4b3615'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 
'205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.441477', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d624f0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.119757512, 'message_signature': '924708fe5501a32df9e3c813281834aa08d087dc38586bb21a3f3422a166c1c2'}]}, 'timestamp': '2025-10-02 12:31:16.458175', '_unique_id': 'e12380e2dc934f879595745ec7632b23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.459 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.460 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.460 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.461 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73b2d406-8a58-4cc1-a6af-152062f6602b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.460612', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d6953e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '117d3e9ad701cacde765010a0d405454f0109b674ef40a6b2d0473fcd03d0991'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 
'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.460612', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d6a5c4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '194a80331d1540c0b5feb7bc2dc46efd1b430ede091e7243ac2998c0834f34e2'}]}, 'timestamp': '2025-10-02 12:31:16.461459', '_unique_id': '7cc51f2be60045e3916bbd131805d492'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.462 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.463 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.463 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.463 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e830bee3-14e4-449a-9555-43504d273865', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.463442', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d6fff6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '74998d55079f483f96d88b93ff10a5b3880afccaae318dc3a031368a1708ff32'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.463442', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d70c76-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '2243172282833864e31b25cb4fd3fe75f33c94ed4be682e1c342ea52d5601e3e'}]}, 'timestamp': '2025-10-02 12:31:16.464072', '_unique_id': 'd95c307d9cdc42978c4867be4b92153f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.465 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.465 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac2be308-19d8-4895-8a29-b7b8da523db3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.465779', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0d75c76-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': 'f81b9b889eceb30581f481d9198c8f2bfd407caa69f1197db9206d530a7324dc'}]}, 'timestamp': '2025-10-02 12:31:16.466142', '_unique_id': 'acd8f167bceb4d8f88dee0483babbb0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.466 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.467 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce0a4cd2-a0be-409d-89b8-556e6841679e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.467748', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0d7a8ca-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': '9cbefb3bf3b02331acd42a0c7fd98d30d67b1a96872db6ae93b612f01c2cde5f'}]}, 'timestamp': '2025-10-02 12:31:16.468094', '_unique_id': '4be17f9791104658a5c1aabb5407fcca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.469 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.read.latency volume: 2399500640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.470 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.read.latency volume: 7841310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9574920-9c37-4e39-8757-a477dd23fa56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2399500640, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.469628', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d7f2d0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': 'e74ff261df0ad0e7d8037b70b2501bbf6f040c5d63948fcb85d64e20a46d9a47'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7841310, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.469628', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d801c6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '4313331f831ffcd8a5c38a79acf08b1c3358d7b489b21b64df23880bff7e9327'}]}, 'timestamp': '2025-10-02 12:31:16.470354', '_unique_id': 'fa3d28b3a443442c925e411ca30074a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.472 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.472 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.472 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>]
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.472 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.472 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.472 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 205fe71f-8f8c-4ae1-ac04-9344041cfd6c: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.472 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.473 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.473 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2fcc93b-5e2c-4e13-9f26-4fe2403c25ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.473033', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d8764c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.119757512, 'message_signature': '594b40f9980a79a8741fb42ea0104cd799b482c51f54e207cc70434725f152e6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.473033', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d8818c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.119757512, 'message_signature': 'c23bf2c7b64955fa310fde11cc5f4057ee64a3511013a3bc05449948ea631ab4'}]}, 'timestamp': '2025-10-02 12:31:16.473622', '_unique_id': 'c25baa0824f44e6a93acab2e477a22fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.474 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.475 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.475 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fed717cd-8c80-4c32-b2a5-19e82404dca6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.475326', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0d8cffc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': '4e31030c9d7fb32fd63c94185fa0ee357445c97b1b8ecd5429d0bf7a85264364'}]}, 'timestamp': '2025-10-02 12:31:16.475649', '_unique_id': '9c1caa19e11549f08b9fd99de77d52cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.476 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.477 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.477 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.477 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5be42f9e-b9ff-44ba-9616-7aefbec9a4a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.477171', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0d917a0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.119757512, 'message_signature': 'b4c7bc98baab46b446d1234c45b63f736d28ffc827284c2d8b976facb4995d62'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.477171', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0d922a4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.119757512, 'message_signature': '52df0580da9c42df5cee9a959f86a0ce32434da3c303769c3830bf147c330f29'}]}, 'timestamp': '2025-10-02 12:31:16.477744', '_unique_id': 'b106a5bead0a4700becd8363d268c21c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.478 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.479 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.479 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67d229a2-8739-491a-8202-9999e2944103', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.479309', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0d96b6a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': '871fa048f081398368e23d213c315407c767afe5def8c74650c41b42b8279c15'}]}, 'timestamp': '2025-10-02 12:31:16.479628', '_unique_id': '3d84faf5f19240f7980364585e095ce9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.480 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.481 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.481 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.481 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-853898395>]
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.481 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.481 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '543a5ef8-e4f9-4cbc-bac0-8cc74e192ebc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.481541', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0d9c290-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': 'ff022885ee3c50b29967b4dd1929dfbafbfdcb14dab45f5be686729ecf2d0cc5'}]}, 'timestamp': '2025-10-02 12:31:16.481882', '_unique_id': '6cf55e7ae744439eaac3c81732b2ad60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.482 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.483 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.483 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.483 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59ee8e70-d44b-41a7-8f12-60a0257cc9dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-vda', 'timestamp': '2025-10-02T12:31:16.483389', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b0da0a7a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': 'd2ad1e412a29c05b7799ff8ebad3c3b14bb894081b86764261c51d2d5f3dafa3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c-sda', 'timestamp': '2025-10-02T12:31:16.483389', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'instance-00000087', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b0da1560-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.068291463, 'message_signature': '3c3666944506c752fa722d861ac842313e659a0b4c748c093b56c69aa54ba9cd'}]}, 'timestamp': '2025-10-02 12:31:16.483982', '_unique_id': '33d9245af9cf4660a1a248ca191b565f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.485 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5844bc5-d17d-4ca6-9d6a-bb308b311065', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.485516', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0da5dd6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': '6f81c8afe8c6ef8d3078928a01b9ad488c5ea6fe17e29327ac47a03b8b2c2a8e'}]}, 'timestamp': '2025-10-02 12:31:16.485833', '_unique_id': '4f4f99f2b0d3419e9616011a5142f68a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.487 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.487 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40b1488f-74a0-49aa-8c65-b96a66c833e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.487373', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0daa62e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': '9e74721e299c7630bc4d064360fc10e416212f20c5a9d27370711e33bee3e2ad'}]}, 'timestamp': '2025-10-02 12:31:16.487683', '_unique_id': '387d958df9db4cc7805664929b0a0a0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.489 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fe2d549-731e-4890-aaeb-e73ca31a26d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.489418', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0daf836-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': '62cf3fb010e8deac05563e5d16742a5e5da37150366e1523bb95aa9c8738fbe3'}]}, 'timestamp': '2025-10-02 12:31:16.489882', '_unique_id': 'bd34501b402a499982964119e125986c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.490 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.491 12 DEBUG ceilometer.compute.pollsters [-] 205fe71f-8f8c-4ae1-ac04-9344041cfd6c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '241e5039-38c9-4fb9-94a7-12ec77f8177c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-00000087-205fe71f-8f8c-4ae1-ac04-9344041cfd6c-tapa9e0d1d4-11', 'timestamp': '2025-10-02T12:31:16.491482', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-853898395', 'name': 'tapa9e0d1d4-11', 'instance_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'instance_type': 'm1.nano', 'host': '817df11b476f78ba96b6b8af0daf5637c885984a4827fad483ae76c3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:da:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9e0d1d4-11'}, 'message_id': 'b0db48cc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6268.104076481, 'message_signature': '4c58620c56825ca24b9663feb3d80dc9ff2e4e04b7b8dc6f9b979d668b0e57c6'}]}, 'timestamp': '2025-10-02 12:31:16.491945', '_unique_id': 'e8a0fe32ce93458682dce3e9516c8ff3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:31:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.884 2 DEBUG nova.network.neutron [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Updating instance_info_cache with network_info: [{"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.908 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Releasing lock "refresh_cache-f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.909 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Instance network_info: |[{"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.910 2 DEBUG oslo_concurrency.lockutils [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.911 2 DEBUG nova.network.neutron [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Refreshing network info cache for port 15e9913c-6f57-4c6b-9187-b313d2cc7b28 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.918 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Start _get_guest_xml network_info=[{"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.926 2 WARNING nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.934 2 DEBUG nova.virt.libvirt.host [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.936 2 DEBUG nova.virt.libvirt.host [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.944 2 DEBUG nova.virt.libvirt.host [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.944 2 DEBUG nova.virt.libvirt.host [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.945 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.945 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.946 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.946 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.946 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.946 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.946 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.946 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.946 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.947 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.947 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.947 2 DEBUG nova.virt.hardware [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.950 2 DEBUG nova.virt.libvirt.vif [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-393561322',display_name='tempest-ServerMetadataNegativeTestJSON-server-393561322',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-393561322',id=138,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b6f9110c2f2b4d5ba26abe5fc13b1395',ramdisk_id='',reservation_id='r-yfwdf0l3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-2025431146',owner_user_name='tempest-ServerMetadataNegativeTestJSON-2025431146-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:12Z,user_data=None,user_id='ee501201a91d4d1facccde8769261729',uuid=f9bd9b91-34ee-4ea0-9ff5-38db09b7f541,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.950 2 DEBUG nova.network.os_vif_util [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Converting VIF {"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.951 2 DEBUG nova.network.os_vif_util [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:ce:78,bridge_name='br-int',has_traffic_filtering=True,id=15e9913c-6f57-4c6b-9187-b313d2cc7b28,network=Network(04081b94-8d27-4ffe-952c-7ea90fd87701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15e9913c-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.951 2 DEBUG nova.objects.instance [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lazy-loading 'pci_devices' on Instance uuid f9bd9b91-34ee-4ea0-9ff5-38db09b7f541 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.965 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <uuid>f9bd9b91-34ee-4ea0-9ff5-38db09b7f541</uuid>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <name>instance-0000008a</name>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-393561322</nova:name>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:31:17</nova:creationTime>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:user uuid="ee501201a91d4d1facccde8769261729">tempest-ServerMetadataNegativeTestJSON-2025431146-project-member</nova:user>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:project uuid="b6f9110c2f2b4d5ba26abe5fc13b1395">tempest-ServerMetadataNegativeTestJSON-2025431146</nova:project>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        <nova:port uuid="15e9913c-6f57-4c6b-9187-b313d2cc7b28">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <entry name="serial">f9bd9b91-34ee-4ea0-9ff5-38db09b7f541</entry>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <entry name="uuid">f9bd9b91-34ee-4ea0-9ff5-38db09b7f541</entry>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk.config"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:87:ce:78"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <target dev="tap15e9913c-6f"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/console.log" append="off"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:31:17 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:31:17 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:31:17 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:31:17 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.965 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Preparing to wait for external event network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.965 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.965 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.965 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.966 2 DEBUG nova.virt.libvirt.vif [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-393561322',display_name='tempest-ServerMetadataNegativeTestJSON-server-393561322',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-393561322',id=138,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b6f9110c2f2b4d5ba26abe5fc13b1395',ramdisk_id='',reservation_id='r-yfwdf0l3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-2025431146',owner
_user_name='tempest-ServerMetadataNegativeTestJSON-2025431146-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:12Z,user_data=None,user_id='ee501201a91d4d1facccde8769261729',uuid=f9bd9b91-34ee-4ea0-9ff5-38db09b7f541,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.966 2 DEBUG nova.network.os_vif_util [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Converting VIF {"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.967 2 DEBUG nova.network.os_vif_util [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:ce:78,bridge_name='br-int',has_traffic_filtering=True,id=15e9913c-6f57-4c6b-9187-b313d2cc7b28,network=Network(04081b94-8d27-4ffe-952c-7ea90fd87701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15e9913c-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.967 2 DEBUG os_vif [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:ce:78,bridge_name='br-int',has_traffic_filtering=True,id=15e9913c-6f57-4c6b-9187-b313d2cc7b28,network=Network(04081b94-8d27-4ffe-952c-7ea90fd87701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15e9913c-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15e9913c-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.972 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15e9913c-6f, col_values=(('external_ids', {'iface-id': '15e9913c-6f57-4c6b-9187-b313d2cc7b28', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:ce:78', 'vm-uuid': 'f9bd9b91-34ee-4ea0-9ff5-38db09b7f541'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:17 np0005466013 NetworkManager[51205]: <info>  [1759408277.9741] manager: (tap15e9913c-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:17 np0005466013 nova_compute[192144]: 2025-10-02 12:31:17.985 2 INFO os_vif [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:ce:78,bridge_name='br-int',has_traffic_filtering=True,id=15e9913c-6f57-4c6b-9187-b313d2cc7b28,network=Network(04081b94-8d27-4ffe-952c-7ea90fd87701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15e9913c-6f')#033[00m
Oct  2 08:31:18 np0005466013 nova_compute[192144]: 2025-10-02 12:31:18.196 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:18 np0005466013 nova_compute[192144]: 2025-10-02 12:31:18.197 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:18 np0005466013 nova_compute[192144]: 2025-10-02 12:31:18.197 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] No VIF found with MAC fa:16:3e:87:ce:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:31:18 np0005466013 nova_compute[192144]: 2025-10-02 12:31:18.198 2 INFO nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Using config drive#033[00m
Oct  2 08:31:18 np0005466013 nova_compute[192144]: 2025-10-02 12:31:18.949 2 INFO nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Creating config drive at /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk.config#033[00m
Oct  2 08:31:18 np0005466013 nova_compute[192144]: 2025-10-02 12:31:18.959 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpevb_84li execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.112 2 DEBUG oslo_concurrency.processutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpevb_84li" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:19 np0005466013 kernel: tap15e9913c-6f: entered promiscuous mode
Oct  2 08:31:19 np0005466013 NetworkManager[51205]: <info>  [1759408279.2464] manager: (tap15e9913c-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/268)
Oct  2 08:31:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:19Z|00602|binding|INFO|Claiming lport 15e9913c-6f57-4c6b-9187-b313d2cc7b28 for this chassis.
Oct  2 08:31:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:19Z|00603|binding|INFO|15e9913c-6f57-4c6b-9187-b313d2cc7b28: Claiming fa:16:3e:87:ce:78 10.100.0.11
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.278 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:ce:78 10.100.0.11'], port_security=['fa:16:3e:87:ce:78 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f9bd9b91-34ee-4ea0-9ff5-38db09b7f541', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04081b94-8d27-4ffe-952c-7ea90fd87701', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6f9110c2f2b4d5ba26abe5fc13b1395', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac70135a-c24c-4a8b-98cb-6f151b5957d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ca14fbf-fca9-4419-b124-4a664fb3b231, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=15e9913c-6f57-4c6b-9187-b313d2cc7b28) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.282 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 15e9913c-6f57-4c6b-9187-b313d2cc7b28 in datapath 04081b94-8d27-4ffe-952c-7ea90fd87701 bound to our chassis#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:19Z|00604|binding|INFO|Setting lport 15e9913c-6f57-4c6b-9187-b313d2cc7b28 ovn-installed in OVS
Oct  2 08:31:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:19Z|00605|binding|INFO|Setting lport 15e9913c-6f57-4c6b-9187-b313d2cc7b28 up in Southbound
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.288 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04081b94-8d27-4ffe-952c-7ea90fd87701#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 systemd-udevd[243412]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.313 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b5232220-2ca5-4d5a-be0e-043b75e62c4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.314 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap04081b94-81 in ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.319 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap04081b94-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.320 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[93100274-3336-41b1-86c6-06b5ca5cae95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.322 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[47fa6ce8-86be-4e12-b4de-9ed2e5a55b74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 NetworkManager[51205]: <info>  [1759408279.3249] device (tap15e9913c-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:31:19 np0005466013 NetworkManager[51205]: <info>  [1759408279.3255] device (tap15e9913c-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:31:19 np0005466013 systemd-machined[152202]: New machine qemu-68-instance-0000008a.
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.341 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[87bc34cc-08d1-44f7-a3de-9ae2074e4d36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 systemd[1]: Started Virtual Machine qemu-68-instance-0000008a.
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.367 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3adb2388-67e0-4717-8615-b57f533d93f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 podman[243378]: 2025-10-02 12:31:19.379208959 +0000 UTC m=+0.152889798 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:31:19 np0005466013 podman[243379]: 2025-10-02 12:31:19.380989008 +0000 UTC m=+0.154457549 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible)
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.406 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[59c9d2d8-c6f1-4ddb-9239-dd7e44b98426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 systemd-udevd[243423]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:19 np0005466013 podman[243380]: 2025-10-02 12:31:19.415280886 +0000 UTC m=+0.183740361 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:19 np0005466013 NetworkManager[51205]: <info>  [1759408279.4157] manager: (tap04081b94-80): new Veth device (/org/freedesktop/NetworkManager/Devices/269)
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.413 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a6be04f6-4d83-48ee-b2ab-20c2ef07d19d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.469 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[183aa542-d170-4e56-bd8e-65ef0fb429ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.475 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1fda04-ae2e-47a8-af76-9a926690fa24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 NetworkManager[51205]: <info>  [1759408279.5056] device (tap04081b94-80): carrier: link connected
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.516 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9019ffdd-db2b-45ad-a572-a700afc7007b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.536 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0cd828-84b4-4773-8453-2073446acc7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04081b94-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:15:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627112, 'reachable_time': 27304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243476, 'error': None, 'target': 'ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.560 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac49f45-3bf7-4ee3-8ba0-f4c818e73adc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:151e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627112, 'tstamp': 627112}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243477, 'error': None, 'target': 'ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.588 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ca1f2763-76d7-42d1-95ca-0ab648c7a16c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04081b94-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:15:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627112, 'reachable_time': 27304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243478, 'error': None, 'target': 'ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.640 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fe31dd03-c63d-4315-8964-75b82f5ba001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.755 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b550a151-974f-4013-b036-6644b2904706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.759 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04081b94-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.759 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.761 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04081b94-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:19 np0005466013 kernel: tap04081b94-80: entered promiscuous mode
Oct  2 08:31:19 np0005466013 NetworkManager[51205]: <info>  [1759408279.7976] manager: (tap04081b94-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.804 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04081b94-80, col_values=(('external_ids', {'iface-id': '8ac47916-ce5c-4f94-b114-9661e46148f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:19Z|00606|binding|INFO|Releasing lport 8ac47916-ce5c-4f94-b114-9661e46148f1 from this chassis (sb_readonly=0)
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.808 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/04081b94-8d27-4ffe-952c-7ea90fd87701.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/04081b94-8d27-4ffe-952c-7ea90fd87701.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.809 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[02c047b2-d94a-4239-9cab-4d0081fb881f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.811 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-04081b94-8d27-4ffe-952c-7ea90fd87701
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/04081b94-8d27-4ffe-952c-7ea90fd87701.pid.haproxy
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 04081b94-8d27-4ffe-952c-7ea90fd87701
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:31:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:19.812 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701', 'env', 'PROCESS_TAG=haproxy-04081b94-8d27-4ffe-952c-7ea90fd87701', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/04081b94-8d27-4ffe-952c-7ea90fd87701.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.971 2 DEBUG nova.compute.manager [req-14a707bf-dc38-4a95-9047-0b99aa53c164 req-7a1f90ad-8bfa-453b-b075-7f38e44271da 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received event network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.973 2 DEBUG oslo_concurrency.lockutils [req-14a707bf-dc38-4a95-9047-0b99aa53c164 req-7a1f90ad-8bfa-453b-b075-7f38e44271da 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.973 2 DEBUG oslo_concurrency.lockutils [req-14a707bf-dc38-4a95-9047-0b99aa53c164 req-7a1f90ad-8bfa-453b-b075-7f38e44271da 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.974 2 DEBUG oslo_concurrency.lockutils [req-14a707bf-dc38-4a95-9047-0b99aa53c164 req-7a1f90ad-8bfa-453b-b075-7f38e44271da 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:19 np0005466013 nova_compute[192144]: 2025-10-02 12:31:19.974 2 DEBUG nova.compute.manager [req-14a707bf-dc38-4a95-9047-0b99aa53c164 req-7a1f90ad-8bfa-453b-b075-7f38e44271da 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Processing event network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.288 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408280.2876227, f9bd9b91-34ee-4ea0-9ff5-38db09b7f541 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.290 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] VM Started (Lifecycle Event)#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.292 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.296 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.300 2 INFO nova.virt.libvirt.driver [-] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Instance spawned successfully.#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.300 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.317 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.325 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.330 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.330 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.331 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.331 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.332 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.332 2 DEBUG nova.virt.libvirt.driver [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.355 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.356 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408280.2886536, f9bd9b91-34ee-4ea0-9ff5-38db09b7f541 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.356 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:31:20 np0005466013 podman[243517]: 2025-10-02 12:31:20.287493956 +0000 UTC m=+0.040465305 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.383 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.386 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408280.2971067, f9bd9b91-34ee-4ea0-9ff5-38db09b7f541 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.387 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.402 2 INFO nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Took 8.12 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.403 2 DEBUG nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.406 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.413 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.449 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.508 2 INFO nova.compute.manager [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Took 8.86 seconds to build instance.#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.534 2 DEBUG oslo_concurrency.lockutils [None req-af477cdf-64d5-4a02-9803-7bc0b1a56d0e ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.813 2 DEBUG nova.network.neutron [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Updated VIF entry in instance network info cache for port 15e9913c-6f57-4c6b-9187-b313d2cc7b28. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.814 2 DEBUG nova.network.neutron [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Updating instance_info_cache with network_info: [{"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:20 np0005466013 nova_compute[192144]: 2025-10-02 12:31:20.833 2 DEBUG oslo_concurrency.lockutils [req-1f7e88bc-98b4-4c32-a451-c7db316c9ab3 req-b1dd2a6f-c2f7-41a0-950a-c8ed42481394 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:20 np0005466013 podman[243517]: 2025-10-02 12:31:20.919928575 +0000 UTC m=+0.672899944 container create ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:21 np0005466013 systemd[1]: Started libpod-conmon-ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440.scope.
Oct  2 08:31:21 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:31:21 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ab18cadf458428fff59c3e818657bb4cf544075f7a7ebdf146a6f5fd98958a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:31:21 np0005466013 podman[243517]: 2025-10-02 12:31:21.339379821 +0000 UTC m=+1.092351180 container init ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:31:21 np0005466013 podman[243517]: 2025-10-02 12:31:21.351668669 +0000 UTC m=+1.104640008 container start ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:31:21 np0005466013 neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701[243532]: [NOTICE]   (243536) : New worker (243538) forked
Oct  2 08:31:21 np0005466013 neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701[243532]: [NOTICE]   (243536) : Loading success.
Oct  2 08:31:22 np0005466013 nova_compute[192144]: 2025-10-02 12:31:22.099 2 DEBUG nova.compute.manager [req-7efc88bf-bfbc-49f3-9637-766350e07d0b req-4341db8c-8b33-4b6e-aaff-bcef991d65a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received event network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:22 np0005466013 nova_compute[192144]: 2025-10-02 12:31:22.101 2 DEBUG oslo_concurrency.lockutils [req-7efc88bf-bfbc-49f3-9637-766350e07d0b req-4341db8c-8b33-4b6e-aaff-bcef991d65a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:22 np0005466013 nova_compute[192144]: 2025-10-02 12:31:22.102 2 DEBUG oslo_concurrency.lockutils [req-7efc88bf-bfbc-49f3-9637-766350e07d0b req-4341db8c-8b33-4b6e-aaff-bcef991d65a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:22 np0005466013 nova_compute[192144]: 2025-10-02 12:31:22.102 2 DEBUG oslo_concurrency.lockutils [req-7efc88bf-bfbc-49f3-9637-766350e07d0b req-4341db8c-8b33-4b6e-aaff-bcef991d65a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:22 np0005466013 nova_compute[192144]: 2025-10-02 12:31:22.103 2 DEBUG nova.compute.manager [req-7efc88bf-bfbc-49f3-9637-766350e07d0b req-4341db8c-8b33-4b6e-aaff-bcef991d65a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] No waiting events found dispatching network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:22 np0005466013 nova_compute[192144]: 2025-10-02 12:31:22.104 2 WARNING nova.compute.manager [req-7efc88bf-bfbc-49f3-9637-766350e07d0b req-4341db8c-8b33-4b6e-aaff-bcef991d65a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received unexpected event network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:31:22 np0005466013 nova_compute[192144]: 2025-10-02 12:31:22.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:24 np0005466013 podman[243547]: 2025-10-02 12:31:24.727505145 +0000 UTC m=+0.087045841 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:31:24 np0005466013 podman[243548]: 2025-10-02 12:31:24.73790795 +0000 UTC m=+0.094090534 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:31:24 np0005466013 nova_compute[192144]: 2025-10-02 12:31:24.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.366 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.366 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.367 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.367 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.367 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.377 2 INFO nova.compute.manager [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Terminating instance#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.386 2 DEBUG nova.compute.manager [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:31:25 np0005466013 kernel: tap15e9913c-6f (unregistering): left promiscuous mode
Oct  2 08:31:25 np0005466013 NetworkManager[51205]: <info>  [1759408285.4106] device (tap15e9913c-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:31:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:25Z|00607|binding|INFO|Releasing lport 15e9913c-6f57-4c6b-9187-b313d2cc7b28 from this chassis (sb_readonly=0)
Oct  2 08:31:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:25Z|00608|binding|INFO|Setting lport 15e9913c-6f57-4c6b-9187-b313d2cc7b28 down in Southbound
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:25Z|00609|binding|INFO|Removing iface tap15e9913c-6f ovn-installed in OVS
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:25.441 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:ce:78 10.100.0.11'], port_security=['fa:16:3e:87:ce:78 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f9bd9b91-34ee-4ea0-9ff5-38db09b7f541', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04081b94-8d27-4ffe-952c-7ea90fd87701', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6f9110c2f2b4d5ba26abe5fc13b1395', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ac70135a-c24c-4a8b-98cb-6f151b5957d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ca14fbf-fca9-4419-b124-4a664fb3b231, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=15e9913c-6f57-4c6b-9187-b313d2cc7b28) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:25.443 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 15e9913c-6f57-4c6b-9187-b313d2cc7b28 in datapath 04081b94-8d27-4ffe-952c-7ea90fd87701 unbound from our chassis#033[00m
Oct  2 08:31:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:25.447 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04081b94-8d27-4ffe-952c-7ea90fd87701, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:31:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:25.448 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[74e7116d-3b5a-450b-8e2e-9a0c7af64636]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:25.449 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701 namespace which is not needed anymore#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466013 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008a.scope: Deactivated successfully.
Oct  2 08:31:25 np0005466013 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008a.scope: Consumed 5.916s CPU time.
Oct  2 08:31:25 np0005466013 systemd-machined[152202]: Machine qemu-68-instance-0000008a terminated.
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.684 2 INFO nova.virt.libvirt.driver [-] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Instance destroyed successfully.#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.684 2 DEBUG nova.objects.instance [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lazy-loading 'resources' on Instance uuid f9bd9b91-34ee-4ea0-9ff5-38db09b7f541 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.701 2 DEBUG nova.virt.libvirt.vif [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-393561322',display_name='tempest-ServerMetadataNegativeTestJSON-server-393561322',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-393561322',id=138,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:31:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b6f9110c2f2b4d5ba26abe5fc13b1395',ramdisk_id='',reservation_id='r-yfwdf0l3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-2025431146',owner_user_name='tempest-ServerMetadataNegativeTestJSON-2025431146-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:20Z,user_data=None,user_id='ee501201a91d4d1facccde8769261729',uuid=f9bd9b91-34ee-4ea0-9ff5-38db09b7f541,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.702 2 DEBUG nova.network.os_vif_util [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Converting VIF {"id": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "address": "fa:16:3e:87:ce:78", "network": {"id": "04081b94-8d27-4ffe-952c-7ea90fd87701", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-971132582-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6f9110c2f2b4d5ba26abe5fc13b1395", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15e9913c-6f", "ovs_interfaceid": "15e9913c-6f57-4c6b-9187-b313d2cc7b28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.703 2 DEBUG nova.network.os_vif_util [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:ce:78,bridge_name='br-int',has_traffic_filtering=True,id=15e9913c-6f57-4c6b-9187-b313d2cc7b28,network=Network(04081b94-8d27-4ffe-952c-7ea90fd87701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15e9913c-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.703 2 DEBUG os_vif [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:ce:78,bridge_name='br-int',has_traffic_filtering=True,id=15e9913c-6f57-4c6b-9187-b313d2cc7b28,network=Network(04081b94-8d27-4ffe-952c-7ea90fd87701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15e9913c-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15e9913c-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.716 2 INFO os_vif [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:ce:78,bridge_name='br-int',has_traffic_filtering=True,id=15e9913c-6f57-4c6b-9187-b313d2cc7b28,network=Network(04081b94-8d27-4ffe-952c-7ea90fd87701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15e9913c-6f')#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.717 2 INFO nova.virt.libvirt.driver [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Deleting instance files /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541_del#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.718 2 INFO nova.virt.libvirt.driver [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Deletion of /var/lib/nova/instances/f9bd9b91-34ee-4ea0-9ff5-38db09b7f541_del complete#033[00m
Oct  2 08:31:25 np0005466013 neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701[243532]: [NOTICE]   (243536) : haproxy version is 2.8.14-c23fe91
Oct  2 08:31:25 np0005466013 neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701[243532]: [NOTICE]   (243536) : path to executable is /usr/sbin/haproxy
Oct  2 08:31:25 np0005466013 neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701[243532]: [WARNING]  (243536) : Exiting Master process...
Oct  2 08:31:25 np0005466013 neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701[243532]: [ALERT]    (243536) : Current worker (243538) exited with code 143 (Terminated)
Oct  2 08:31:25 np0005466013 neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701[243532]: [WARNING]  (243536) : All workers exited. Exiting... (0)
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.808 2 INFO nova.compute.manager [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.809 2 DEBUG oslo.service.loopingcall [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.810 2 DEBUG nova.compute.manager [-] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.810 2 DEBUG nova.network.neutron [-] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:31:25 np0005466013 systemd[1]: libpod-ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440.scope: Deactivated successfully.
Oct  2 08:31:25 np0005466013 podman[243616]: 2025-10-02 12:31:25.817251288 +0000 UTC m=+0.257485411 container died ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.958 2 DEBUG nova.compute.manager [req-ad701963-f274-41c5-9a5b-0bf7d0459852 req-baeafd5f-cc3a-4820-9234-0c6ddd5acec2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received event network-vif-unplugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.959 2 DEBUG oslo_concurrency.lockutils [req-ad701963-f274-41c5-9a5b-0bf7d0459852 req-baeafd5f-cc3a-4820-9234-0c6ddd5acec2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.960 2 DEBUG oslo_concurrency.lockutils [req-ad701963-f274-41c5-9a5b-0bf7d0459852 req-baeafd5f-cc3a-4820-9234-0c6ddd5acec2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.960 2 DEBUG oslo_concurrency.lockutils [req-ad701963-f274-41c5-9a5b-0bf7d0459852 req-baeafd5f-cc3a-4820-9234-0c6ddd5acec2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.961 2 DEBUG nova.compute.manager [req-ad701963-f274-41c5-9a5b-0bf7d0459852 req-baeafd5f-cc3a-4820-9234-0c6ddd5acec2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] No waiting events found dispatching network-vif-unplugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:25 np0005466013 nova_compute[192144]: 2025-10-02 12:31:25.961 2 DEBUG nova.compute.manager [req-ad701963-f274-41c5-9a5b-0bf7d0459852 req-baeafd5f-cc3a-4820-9234-0c6ddd5acec2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received event network-vif-unplugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:31:25 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440-userdata-shm.mount: Deactivated successfully.
Oct  2 08:31:25 np0005466013 systemd[1]: var-lib-containers-storage-overlay-2ab18cadf458428fff59c3e818657bb4cf544075f7a7ebdf146a6f5fd98958a3-merged.mount: Deactivated successfully.
Oct  2 08:31:26 np0005466013 podman[243616]: 2025-10-02 12:31:26.066861665 +0000 UTC m=+0.507095788 container cleanup ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:31:26 np0005466013 systemd[1]: libpod-conmon-ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440.scope: Deactivated successfully.
Oct  2 08:31:26 np0005466013 podman[243669]: 2025-10-02 12:31:26.354498025 +0000 UTC m=+0.253056672 container remove ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.363 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e799b652-672e-4025-818e-3b8b5f10d73b]: (4, ('Thu Oct  2 12:31:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701 (ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440)\ned72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440\nThu Oct  2 12:31:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701 (ed72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440)\ned72aa3a1c293054cc06bee89284abca7030b75d1f5a45eb412928d52b6e0440\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.365 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c35f3410-96e1-47de-a744-a1949170371b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.367 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04081b94-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:26 np0005466013 kernel: tap04081b94-80: left promiscuous mode
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.398 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[948506fb-f44f-45c9-a1ef-c298e99b7005]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.432 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[224b2ca6-b6d1-40c4-adfa-9085ef731045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.433 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7a1bf899-f06f-45c3-9cdb-7a8e04cb9681]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.450 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e0dc3175-b455-4ae2-95cf-2c6c18b111cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627101, 'reachable_time': 19784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243687, 'error': None, 'target': 'ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:26 np0005466013 systemd[1]: run-netns-ovnmeta\x2d04081b94\x2d8d27\x2d4ffe\x2d952c\x2d7ea90fd87701.mount: Deactivated successfully.
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.458 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-04081b94-8d27-4ffe-952c-7ea90fd87701 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:31:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:26.458 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[135f18b5-7290-4515-bb56-247fb18702c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.817 2 DEBUG nova.network.neutron [-] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.844 2 INFO nova.compute.manager [-] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Took 1.03 seconds to deallocate network for instance.#033[00m
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.915 2 DEBUG nova.compute.manager [req-ad744b65-a615-4484-9176-c030d4cae11c req-794e227b-682b-441c-b549-faded4631e32 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received event network-vif-deleted-15e9913c-6f57-4c6b-9187-b313d2cc7b28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.970 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:26 np0005466013 nova_compute[192144]: 2025-10-02 12:31:26.971 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:27 np0005466013 nova_compute[192144]: 2025-10-02 12:31:27.058 2 DEBUG nova.compute.provider_tree [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:27 np0005466013 nova_compute[192144]: 2025-10-02 12:31:27.077 2 DEBUG nova.scheduler.client.report [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:27 np0005466013 nova_compute[192144]: 2025-10-02 12:31:27.114 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:27 np0005466013 nova_compute[192144]: 2025-10-02 12:31:27.166 2 INFO nova.scheduler.client.report [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Deleted allocations for instance f9bd9b91-34ee-4ea0-9ff5-38db09b7f541#033[00m
Oct  2 08:31:27 np0005466013 nova_compute[192144]: 2025-10-02 12:31:27.296 2 DEBUG oslo_concurrency.lockutils [None req-66f3673c-9001-4b19-beab-47590dbb2ddd ee501201a91d4d1facccde8769261729 b6f9110c2f2b4d5ba26abe5fc13b1395 - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:28 np0005466013 nova_compute[192144]: 2025-10-02 12:31:28.050 2 DEBUG nova.compute.manager [req-49631be1-28b6-4626-b5d7-5a64b1ec70ab req-a0dc4ec1-75e0-4ec3-9855-c0959336d5e4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received event network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:28 np0005466013 nova_compute[192144]: 2025-10-02 12:31:28.051 2 DEBUG oslo_concurrency.lockutils [req-49631be1-28b6-4626-b5d7-5a64b1ec70ab req-a0dc4ec1-75e0-4ec3-9855-c0959336d5e4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:28 np0005466013 nova_compute[192144]: 2025-10-02 12:31:28.051 2 DEBUG oslo_concurrency.lockutils [req-49631be1-28b6-4626-b5d7-5a64b1ec70ab req-a0dc4ec1-75e0-4ec3-9855-c0959336d5e4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:28 np0005466013 nova_compute[192144]: 2025-10-02 12:31:28.052 2 DEBUG oslo_concurrency.lockutils [req-49631be1-28b6-4626-b5d7-5a64b1ec70ab req-a0dc4ec1-75e0-4ec3-9855-c0959336d5e4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "f9bd9b91-34ee-4ea0-9ff5-38db09b7f541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:28 np0005466013 nova_compute[192144]: 2025-10-02 12:31:28.052 2 DEBUG nova.compute.manager [req-49631be1-28b6-4626-b5d7-5a64b1ec70ab req-a0dc4ec1-75e0-4ec3-9855-c0959336d5e4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] No waiting events found dispatching network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:28 np0005466013 nova_compute[192144]: 2025-10-02 12:31:28.053 2 WARNING nova.compute.manager [req-49631be1-28b6-4626-b5d7-5a64b1ec70ab req-a0dc4ec1-75e0-4ec3-9855-c0959336d5e4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Received unexpected event network-vif-plugged-15e9913c-6f57-4c6b-9187-b313d2cc7b28 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:31:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:29Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:da:2c 10.100.0.14
Oct  2 08:31:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:29Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:da:2c 10.100.0.14
Oct  2 08:31:29 np0005466013 nova_compute[192144]: 2025-10-02 12:31:29.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466013 nova_compute[192144]: 2025-10-02 12:31:30.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466013 nova_compute[192144]: 2025-10-02 12:31:30.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.014 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.015 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.015 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.015 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.078 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.132 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.133 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.204 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:31Z|00610|binding|INFO|Releasing lport 96781755-97bc-4b0c-9d56-1511bbfc3ac7 from this chassis (sb_readonly=0)
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.340 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.341 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5501MB free_disk=73.1711196899414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.342 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.342 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.447 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 205fe71f-8f8c-4ae1-ac04-9344041cfd6c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.448 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.448 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.491 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.524 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.556 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:31:31 np0005466013 nova_compute[192144]: 2025-10-02 12:31:31.556 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:33 np0005466013 nova_compute[192144]: 2025-10-02 12:31:33.558 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:33 np0005466013 nova_compute[192144]: 2025-10-02 12:31:33.558 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:31:35 np0005466013 nova_compute[192144]: 2025-10-02 12:31:34.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:35 np0005466013 nova_compute[192144]: 2025-10-02 12:31:35.130 2 INFO nova.compute.manager [None req-1c9af7c2-5adb-4ba4-8856-6077cc3344ed 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Get console output#033[00m
Oct  2 08:31:35 np0005466013 nova_compute[192144]: 2025-10-02 12:31:35.135 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:31:35 np0005466013 nova_compute[192144]: 2025-10-02 12:31:35.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.305 2 DEBUG nova.compute.manager [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-changed-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.307 2 DEBUG nova.compute.manager [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Refreshing instance network info cache due to event network-changed-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.307 2 DEBUG oslo_concurrency.lockutils [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.307 2 DEBUG oslo_concurrency.lockutils [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.307 2 DEBUG nova.network.neutron [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Refreshing network info cache for port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.416 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.416 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.417 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.417 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.417 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.431 2 INFO nova.compute.manager [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Terminating instance#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.443 2 DEBUG nova.compute.manager [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:31:36 np0005466013 kernel: tapa9e0d1d4-11 (unregistering): left promiscuous mode
Oct  2 08:31:36 np0005466013 NetworkManager[51205]: <info>  [1759408296.4713] device (tapa9e0d1d4-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:31:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:36Z|00611|binding|INFO|Releasing lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 from this chassis (sb_readonly=0)
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:36Z|00612|binding|INFO|Setting lport a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 down in Southbound
Oct  2 08:31:36 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:36Z|00613|binding|INFO|Removing iface tapa9e0d1d4-11 ovn-installed in OVS
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.495 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:da:2c 10.100.0.14'], port_security=['fa:16:3e:b4:da:2c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '205fe71f-8f8c-4ae1-ac04-9344041cfd6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efed3bdf-e287-4892-a4a2-6d198fc94413', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7f00fc5d-beb8-472a-8d55-c082ab0c14cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec777f9d-784c-4505-9548-eae114383c79, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.496 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 in datapath efed3bdf-e287-4892-a4a2-6d198fc94413 unbound from our chassis#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.498 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network efed3bdf-e287-4892-a4a2-6d198fc94413, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.499 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[433da2fd-652a-409c-a079-751e8adbf665]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.499 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 namespace which is not needed anymore#033[00m
Oct  2 08:31:36 np0005466013 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000087.scope: Deactivated successfully.
Oct  2 08:31:36 np0005466013 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000087.scope: Consumed 15.296s CPU time.
Oct  2 08:31:36 np0005466013 systemd-machined[152202]: Machine qemu-67-instance-00000087 terminated.
Oct  2 08:31:36 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[243350]: [NOTICE]   (243354) : haproxy version is 2.8.14-c23fe91
Oct  2 08:31:36 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[243350]: [NOTICE]   (243354) : path to executable is /usr/sbin/haproxy
Oct  2 08:31:36 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[243350]: [WARNING]  (243354) : Exiting Master process...
Oct  2 08:31:36 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[243350]: [ALERT]    (243354) : Current worker (243356) exited with code 143 (Terminated)
Oct  2 08:31:36 np0005466013 neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413[243350]: [WARNING]  (243354) : All workers exited. Exiting... (0)
Oct  2 08:31:36 np0005466013 systemd[1]: libpod-f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac.scope: Deactivated successfully.
Oct  2 08:31:36 np0005466013 podman[243724]: 2025-10-02 12:31:36.619409655 +0000 UTC m=+0.046755643 container died f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac-userdata-shm.mount: Deactivated successfully.
Oct  2 08:31:36 np0005466013 systemd[1]: var-lib-containers-storage-overlay-d65ad3ffbf5b32d7653c93e7970ef3c7d38112a08be9831552f0a650f512e76e-merged.mount: Deactivated successfully.
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 podman[243724]: 2025-10-02 12:31:36.684491166 +0000 UTC m=+0.111837154 container cleanup f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:31:36 np0005466013 systemd[1]: libpod-conmon-f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac.scope: Deactivated successfully.
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.702 2 INFO nova.virt.libvirt.driver [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Instance destroyed successfully.#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.702 2 DEBUG nova.objects.instance [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'resources' on Instance uuid 205fe71f-8f8c-4ae1-ac04-9344041cfd6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.715 2 DEBUG nova.virt.libvirt.vif [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:30:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-853898395',display_name='tempest-TestNetworkAdvancedServerOps-server-853898395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-853898395',id=135,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJcWJR9H07bCSYUqeF8jPAGHvorXmh9wYiFyjeR3r1c8wskLP0QtO4hgxQKDidv5VyuYzyK3XTaMOh059pbUMGGg0yimxkoI04eFolQEnRD7tnn/yCbWfabjbEavYmykBA==',key_name='tempest-TestNetworkAdvancedServerOps-1317244517',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:31:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-3e4dgpei',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:12Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=205fe71f-8f8c-4ae1-ac04-9344041cfd6c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.715 2 DEBUG nova.network.os_vif_util [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.716 2 DEBUG nova.network.os_vif_util [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.716 2 DEBUG os_vif [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.719 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9e0d1d4-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.724 2 INFO os_vif [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:da:2c,bridge_name='br-int',has_traffic_filtering=True,id=a9e0d1d4-1101-49b7-adda-6a2a6db11fe1,network=Network(efed3bdf-e287-4892-a4a2-6d198fc94413),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9e0d1d4-11')#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.725 2 INFO nova.virt.libvirt.driver [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Deleting instance files /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c_del#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.727 2 INFO nova.virt.libvirt.driver [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Deletion of /var/lib/nova/instances/205fe71f-8f8c-4ae1-ac04-9344041cfd6c_del complete#033[00m
Oct  2 08:31:36 np0005466013 podman[243770]: 2025-10-02 12:31:36.759084263 +0000 UTC m=+0.051947047 container remove f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.763 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6eca58d9-a88a-44de-84a2-761ca5ca339c]: (4, ('Thu Oct  2 12:31:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 (f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac)\nf51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac\nThu Oct  2 12:31:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 (f51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac)\nf51e4838099107d623428aaecc97c0e6b47a7b76891f57157d1f1921f4d753ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.765 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e17e022-f8e0-4e7b-a23e-f34c34d05744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.765 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefed3bdf-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 kernel: tapefed3bdf-e0: left promiscuous mode
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.795 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[05ed4686-7830-47e8-b783-a0cbd15374ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.799 2 INFO nova.compute.manager [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.800 2 DEBUG oslo.service.loopingcall [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.800 2 DEBUG nova.compute.manager [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.800 2 DEBUG nova.network.neutron [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.833 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f9e0f7-b828-49d0-af6a-8bb2a32bc231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.834 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5d65e3c0-a9e2-4959-895c-12f9c82adfba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.851 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5516dc65-d2e4-49f1-8e6c-75bc8a245da1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626363, 'reachable_time': 38596, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243786, 'error': None, 'target': 'ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.854 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-efed3bdf-e287-4892-a4a2-6d198fc94413 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:31:36 np0005466013 systemd[1]: run-netns-ovnmeta\x2defed3bdf\x2de287\x2d4892\x2da4a2\x2d6d198fc94413.mount: Deactivated successfully.
Oct  2 08:31:36 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:36.854 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[5f85d046-6321-44ac-96d0-f2225ded7cec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.925 2 DEBUG nova.compute.manager [req-4c6a1379-a895-4230-b9bc-b13d3c528845 req-9f405eef-bf6c-45c7-86ff-99733002ed47 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-unplugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.926 2 DEBUG oslo_concurrency.lockutils [req-4c6a1379-a895-4230-b9bc-b13d3c528845 req-9f405eef-bf6c-45c7-86ff-99733002ed47 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.926 2 DEBUG oslo_concurrency.lockutils [req-4c6a1379-a895-4230-b9bc-b13d3c528845 req-9f405eef-bf6c-45c7-86ff-99733002ed47 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.926 2 DEBUG oslo_concurrency.lockutils [req-4c6a1379-a895-4230-b9bc-b13d3c528845 req-9f405eef-bf6c-45c7-86ff-99733002ed47 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.927 2 DEBUG nova.compute.manager [req-4c6a1379-a895-4230-b9bc-b13d3c528845 req-9f405eef-bf6c-45c7-86ff-99733002ed47 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] No waiting events found dispatching network-vif-unplugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:36 np0005466013 nova_compute[192144]: 2025-10-02 12:31:36.927 2 DEBUG nova.compute.manager [req-4c6a1379-a895-4230-b9bc-b13d3c528845 req-9f405eef-bf6c-45c7-86ff-99733002ed47 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-unplugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.191 2 DEBUG nova.network.neutron [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.212 2 INFO nova.compute.manager [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Took 1.41 seconds to deallocate network for instance.#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.324 2 DEBUG nova.compute.manager [req-ad4d7d9c-eea0-4223-91ba-aa00e285662c req-5a5f7a3e-a102-4247-b1ba-3ef56bfcf047 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-deleted-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.362 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.362 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.456 2 DEBUG nova.compute.provider_tree [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.478 2 DEBUG nova.scheduler.client.report [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.507 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.539 2 INFO nova.scheduler.client.report [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Deleted allocations for instance 205fe71f-8f8c-4ae1-ac04-9344041cfd6c#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.624 2 DEBUG oslo_concurrency.lockutils [None req-fb1f5d53-5de9-4a69-8347-b78f2a65ebbc 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.730 2 DEBUG nova.network.neutron [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updated VIF entry in instance network info cache for port a9e0d1d4-1101-49b7-adda-6a2a6db11fe1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.731 2 DEBUG nova.network.neutron [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Updating instance_info_cache with network_info: [{"id": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "address": "fa:16:3e:b4:da:2c", "network": {"id": "efed3bdf-e287-4892-a4a2-6d198fc94413", "bridge": "br-int", "label": "tempest-network-smoke--1318553587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9e0d1d4-11", "ovs_interfaceid": "a9e0d1d4-1101-49b7-adda-6a2a6db11fe1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.772 2 DEBUG oslo_concurrency.lockutils [req-efefa553-0efe-4abb-9490-e38a4237d047 req-87120583-35c8-4beb-86c7-c52dfc6325f6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-205fe71f-8f8c-4ae1-ac04-9344041cfd6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:38 np0005466013 nova_compute[192144]: 2025-10-02 12:31:38.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.007 2 DEBUG nova.compute.manager [req-55605fdd-c0cd-4be7-811b-c21e935b50c6 req-a9d0ea11-9a83-4641-a443-fcf4d0ce8f51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.007 2 DEBUG oslo_concurrency.lockutils [req-55605fdd-c0cd-4be7-811b-c21e935b50c6 req-a9d0ea11-9a83-4641-a443-fcf4d0ce8f51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.007 2 DEBUG oslo_concurrency.lockutils [req-55605fdd-c0cd-4be7-811b-c21e935b50c6 req-a9d0ea11-9a83-4641-a443-fcf4d0ce8f51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.007 2 DEBUG oslo_concurrency.lockutils [req-55605fdd-c0cd-4be7-811b-c21e935b50c6 req-a9d0ea11-9a83-4641-a443-fcf4d0ce8f51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "205fe71f-8f8c-4ae1-ac04-9344041cfd6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.008 2 DEBUG nova.compute.manager [req-55605fdd-c0cd-4be7-811b-c21e935b50c6 req-a9d0ea11-9a83-4641-a443-fcf4d0ce8f51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] No waiting events found dispatching network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.008 2 WARNING nova.compute.manager [req-55605fdd-c0cd-4be7-811b-c21e935b50c6 req-a9d0ea11-9a83-4641-a443-fcf4d0ce8f51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Received unexpected event network-vif-plugged-a9e0d1d4-1101-49b7-adda-6a2a6db11fe1 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:31:39 np0005466013 podman[243787]: 2025-10-02 12:31:39.685760265 +0000 UTC m=+0.057458409 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:31:39 np0005466013 podman[243788]: 2025-10-02 12:31:39.717126767 +0000 UTC m=+0.077067100 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:31:39 np0005466013 podman[243789]: 2025-10-02 12:31:39.727818781 +0000 UTC m=+0.091541620 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:39 np0005466013 nova_compute[192144]: 2025-10-02 12:31:39.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:40 np0005466013 nova_compute[192144]: 2025-10-02 12:31:40.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:40 np0005466013 nova_compute[192144]: 2025-10-02 12:31:40.681 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408285.6790922, f9bd9b91-34ee-4ea0-9ff5-38db09b7f541 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:40 np0005466013 nova_compute[192144]: 2025-10-02 12:31:40.681 2 INFO nova.compute.manager [-] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:31:40 np0005466013 nova_compute[192144]: 2025-10-02 12:31:40.709 2 DEBUG nova.compute.manager [None req-c2f5dbbb-b797-4933-b260-1480e880eb6e - - - - - -] [instance: f9bd9b91-34ee-4ea0-9ff5-38db09b7f541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:41 np0005466013 nova_compute[192144]: 2025-10-02 12:31:41.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:41 np0005466013 nova_compute[192144]: 2025-10-02 12:31:41.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:41 np0005466013 nova_compute[192144]: 2025-10-02 12:31:41.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:41 np0005466013 nova_compute[192144]: 2025-10-02 12:31:41.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:31:41 np0005466013 nova_compute[192144]: 2025-10-02 12:31:41.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:31:42 np0005466013 nova_compute[192144]: 2025-10-02 12:31:42.011 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:31:42 np0005466013 nova_compute[192144]: 2025-10-02 12:31:42.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:42 np0005466013 nova_compute[192144]: 2025-10-02 12:31:42.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:42 np0005466013 nova_compute[192144]: 2025-10-02 12:31:42.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:44 np0005466013 nova_compute[192144]: 2025-10-02 12:31:44.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:45 np0005466013 nova_compute[192144]: 2025-10-02 12:31:45.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:46 np0005466013 nova_compute[192144]: 2025-10-02 12:31:46.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:49 np0005466013 podman[243856]: 2025-10-02 12:31:49.715112314 +0000 UTC m=+0.082712657 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:31:49 np0005466013 podman[243857]: 2025-10-02 12:31:49.741653795 +0000 UTC m=+0.097325672 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:31:49 np0005466013 podman[243858]: 2025-10-02 12:31:49.78214204 +0000 UTC m=+0.134950022 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm)
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.413 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.413 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.494 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.724 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.724 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.733 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.733 2 INFO nova.compute.claims [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.882 2 DEBUG nova.compute.provider_tree [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.901 2 DEBUG nova.scheduler.client.report [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.935 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:50 np0005466013 nova_compute[192144]: 2025-10-02 12:31:50.936 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.002 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.002 2 DEBUG nova.network.neutron [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.026 2 INFO nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.060 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.193 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.195 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.196 2 INFO nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Creating image(s)#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.197 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "/var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.197 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.198 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.221 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.275 2 DEBUG nova.policy [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.314 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.315 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.315 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.325 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.375 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.377 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.409 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.410 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.410 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.460 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.461 2 DEBUG nova.virt.disk.api [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Checking if we can resize image /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.461 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.544 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.545 2 DEBUG nova.virt.disk.api [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Cannot resize image /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.546 2 DEBUG nova.objects.instance [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'migration_context' on Instance uuid 4958df02-e9fa-4cb2-9175-4313cd3fd658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.572 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.572 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Ensure instance console log exists: /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.573 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.574 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.574 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.701 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408296.7003133, 205fe71f-8f8c-4ae1-ac04-9344041cfd6c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.702 2 INFO nova.compute.manager [-] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:51 np0005466013 nova_compute[192144]: 2025-10-02 12:31:51.757 2 DEBUG nova.compute.manager [None req-fbeee2da-58c5-4fc8-b44f-3f10f089e915 - - - - - -] [instance: 205fe71f-8f8c-4ae1-ac04-9344041cfd6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:52 np0005466013 nova_compute[192144]: 2025-10-02 12:31:52.460 2 DEBUG nova.network.neutron [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Successfully created port: bf1d62fc-3a8d-4493-ae99-723fac577d26 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.343 2 DEBUG nova.network.neutron [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Successfully updated port: bf1d62fc-3a8d-4493-ae99-723fac577d26 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.434 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.434 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquired lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.435 2 DEBUG nova.network.neutron [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.722 2 DEBUG nova.compute.manager [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received event network-changed-bf1d62fc-3a8d-4493-ae99-723fac577d26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.722 2 DEBUG nova.compute.manager [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Refreshing instance network info cache due to event network-changed-bf1d62fc-3a8d-4493-ae99-723fac577d26. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.723 2 DEBUG oslo_concurrency.lockutils [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:54 np0005466013 nova_compute[192144]: 2025-10-02 12:31:54.742 2 DEBUG nova.network.neutron [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:31:55 np0005466013 nova_compute[192144]: 2025-10-02 12:31:55.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466013 podman[243929]: 2025-10-02 12:31:55.736914001 +0000 UTC m=+0.095330471 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:31:55 np0005466013 podman[243930]: 2025-10-02 12:31:55.744936403 +0000 UTC m=+0.098954225 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  2 08:31:55 np0005466013 nova_compute[192144]: 2025-10-02 12:31:55.968 2 DEBUG nova.network.neutron [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updating instance_info_cache with network_info: [{"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.016 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Releasing lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.016 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Instance network_info: |[{"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.017 2 DEBUG oslo_concurrency.lockutils [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.017 2 DEBUG nova.network.neutron [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Refreshing network info cache for port bf1d62fc-3a8d-4493-ae99-723fac577d26 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.020 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Start _get_guest_xml network_info=[{"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.025 2 WARNING nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.030 2 DEBUG nova.virt.libvirt.host [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.030 2 DEBUG nova.virt.libvirt.host [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.033 2 DEBUG nova.virt.libvirt.host [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.034 2 DEBUG nova.virt.libvirt.host [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.035 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.035 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.036 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.036 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.036 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.036 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.036 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.036 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.037 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.037 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.037 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.037 2 DEBUG nova.virt.hardware [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.041 2 DEBUG nova.virt.libvirt.vif [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1854753736',display_name='tempest-₡-1854753736',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest--1854753736',id=140,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-s2du0lnf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs
=None,updated_at=2025-10-02T12:31:51Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=4958df02-e9fa-4cb2-9175-4313cd3fd658,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.042 2 DEBUG nova.network.os_vif_util [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.042 2 DEBUG nova.network.os_vif_util [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:4b:3d,bridge_name='br-int',has_traffic_filtering=True,id=bf1d62fc-3a8d-4493-ae99-723fac577d26,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf1d62fc-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.043 2 DEBUG nova.objects.instance [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4958df02-e9fa-4cb2-9175-4313cd3fd658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.116 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <uuid>4958df02-e9fa-4cb2-9175-4313cd3fd658</uuid>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <name>instance-0000008c</name>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <nova:name>tempest-₡-1854753736</nova:name>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:31:56</nova:creationTime>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:user uuid="27daa263abb54d4d8e3ae34cd1c5ccf5">tempest-ServersTestJSON-1163535506-project-member</nova:user>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:project uuid="a4a7099974504a798e1607c8e6a1f570">tempest-ServersTestJSON-1163535506</nova:project>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        <nova:port uuid="bf1d62fc-3a8d-4493-ae99-723fac577d26">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <entry name="serial">4958df02-e9fa-4cb2-9175-4313cd3fd658</entry>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <entry name="uuid">4958df02-e9fa-4cb2-9175-4313cd3fd658</entry>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.config"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:4a:4b:3d"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <target dev="tapbf1d62fc-3a"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/console.log" append="off"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:31:56 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:31:56 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:31:56 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:31:56 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.118 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Preparing to wait for external event network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.118 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.119 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.119 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.120 2 DEBUG nova.virt.libvirt.vif [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1854753736',display_name='tempest-₡-1854753736',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest--1854753736',id=140,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-s2du0lnf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,tru
sted_certs=None,updated_at=2025-10-02T12:31:51Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=4958df02-e9fa-4cb2-9175-4313cd3fd658,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.121 2 DEBUG nova.network.os_vif_util [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.122 2 DEBUG nova.network.os_vif_util [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:4b:3d,bridge_name='br-int',has_traffic_filtering=True,id=bf1d62fc-3a8d-4493-ae99-723fac577d26,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf1d62fc-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.123 2 DEBUG os_vif [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:4b:3d,bridge_name='br-int',has_traffic_filtering=True,id=bf1d62fc-3a8d-4493-ae99-723fac577d26,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf1d62fc-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.124 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.125 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.127 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf1d62fc-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.128 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbf1d62fc-3a, col_values=(('external_ids', {'iface-id': 'bf1d62fc-3a8d-4493-ae99-723fac577d26', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:4b:3d', 'vm-uuid': '4958df02-e9fa-4cb2-9175-4313cd3fd658'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:56 np0005466013 NetworkManager[51205]: <info>  [1759408316.1824] manager: (tapbf1d62fc-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.187 2 INFO os_vif [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:4b:3d,bridge_name='br-int',has_traffic_filtering=True,id=bf1d62fc-3a8d-4493-ae99-723fac577d26,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf1d62fc-3a')#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.264 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.265 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.265 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No VIF found with MAC fa:16:3e:4a:4b:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.266 2 INFO nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Using config drive#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.723 2 INFO nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Creating config drive at /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.config#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.729 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_vn2gzuf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.859 2 DEBUG oslo_concurrency.processutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_vn2gzuf" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:56 np0005466013 kernel: tapbf1d62fc-3a: entered promiscuous mode
Oct  2 08:31:56 np0005466013 NetworkManager[51205]: <info>  [1759408316.9415] manager: (tapbf1d62fc-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Oct  2 08:31:56 np0005466013 nova_compute[192144]: 2025-10-02 12:31:56.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:56 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:56Z|00614|binding|INFO|Claiming lport bf1d62fc-3a8d-4493-ae99-723fac577d26 for this chassis.
Oct  2 08:31:56 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:56Z|00615|binding|INFO|bf1d62fc-3a8d-4493-ae99-723fac577d26: Claiming fa:16:3e:4a:4b:3d 10.100.0.9
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.955 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:4b:3d 10.100.0.9'], port_security=['fa:16:3e:4a:4b:3d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=bf1d62fc-3a8d-4493-ae99-723fac577d26) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.957 103323 INFO neutron.agent.ovn.metadata.agent [-] Port bf1d62fc-3a8d-4493-ae99-723fac577d26 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 bound to our chassis#033[00m
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.960 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3#033[00m
Oct  2 08:31:56 np0005466013 systemd-udevd[243993]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.976 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[14baea16-b1e1-47df-a254-bcf5e9dc918f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.978 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1acf42c5-01 in ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.979 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1acf42c5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.979 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[433d8dc5-eb96-4a25-a35d-c4b15202f137]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:56 np0005466013 NetworkManager[51205]: <info>  [1759408316.9827] device (tapbf1d62fc-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:31:56 np0005466013 NetworkManager[51205]: <info>  [1759408316.9836] device (tapbf1d62fc-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:31:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:56.982 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d52a4706-da8f-48fc-9d09-f2bea500c8fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:56 np0005466013 systemd-machined[152202]: New machine qemu-69-instance-0000008c.
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.000 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[e25f4532-fdc4-4104-8c5a-f811003bf6d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:57Z|00616|binding|INFO|Setting lport bf1d62fc-3a8d-4493-ae99-723fac577d26 ovn-installed in OVS
Oct  2 08:31:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:57Z|00617|binding|INFO|Setting lport bf1d62fc-3a8d-4493-ae99-723fac577d26 up in Southbound
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 systemd[1]: Started Virtual Machine qemu-69-instance-0000008c.
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.043 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[58113480-3df6-4f48-85a4-cba6c4b0961d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.083 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[55fb6252-8eaf-4e34-8007-ba7dae19cecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.089 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4368289e-b832-4c5a-9f6b-4e979d3f9ed0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 NetworkManager[51205]: <info>  [1759408317.0899] manager: (tap1acf42c5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/273)
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.128 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[09cd19ad-500a-4955-a4f4-9a70e3454306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.132 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0e08b83c-4b78-402e-9b59-18aa956ffaef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 NetworkManager[51205]: <info>  [1759408317.1572] device (tap1acf42c5-00): carrier: link connected
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.162 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[642511e9-1f08-4cb1-96ee-1e351c1cba18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.177 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[852724cb-94e2-41c3-b105-3e771a2bf529]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630877, 'reachable_time': 30317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244026, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.200 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[15f26d21-618e-4b73-973c-791b77f16c2b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:5bcd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630877, 'tstamp': 630877}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244027, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.223 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[20b6104d-447d-41dd-be2a-3cce21c06386]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630877, 'reachable_time': 30317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244028, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.277 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cf871ae2-ad4c-4a9a-8ec1-d756833bbb37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.339 2 DEBUG nova.compute.manager [req-72c5e8cb-a5af-4c0f-9b78-f368f179bc8b req-85451514-97cb-4d19-b281-968cd40794d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received event network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.339 2 DEBUG oslo_concurrency.lockutils [req-72c5e8cb-a5af-4c0f-9b78-f368f179bc8b req-85451514-97cb-4d19-b281-968cd40794d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.339 2 DEBUG oslo_concurrency.lockutils [req-72c5e8cb-a5af-4c0f-9b78-f368f179bc8b req-85451514-97cb-4d19-b281-968cd40794d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.339 2 DEBUG oslo_concurrency.lockutils [req-72c5e8cb-a5af-4c0f-9b78-f368f179bc8b req-85451514-97cb-4d19-b281-968cd40794d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.339 2 DEBUG nova.compute.manager [req-72c5e8cb-a5af-4c0f-9b78-f368f179bc8b req-85451514-97cb-4d19-b281-968cd40794d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Processing event network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.350 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5b683398-0cb8-4196-ac50-2b27787c3554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.352 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.352 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.353 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1acf42c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:57 np0005466013 NetworkManager[51205]: <info>  [1759408317.3552] manager: (tap1acf42c5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Oct  2 08:31:57 np0005466013 kernel: tap1acf42c5-00: entered promiscuous mode
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.357 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1acf42c5-00, col_values=(('external_ids', {'iface-id': 'c198cb2e-a850-46e4-8295-a2f9c280ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:31:57Z|00618|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.361 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.361 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[903f67cd-47a7-45b8-af7a-d46d1350174b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.362 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:31:57 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:31:57.363 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'env', 'PROCESS_TAG=haproxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1acf42c5-084c-4cc4-bdc5-910eec0249e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:57 np0005466013 podman[244065]: 2025-10-02 12:31:57.769954729 +0000 UTC m=+0.072031480 container create 076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.805 2 DEBUG nova.network.neutron [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updated VIF entry in instance network info cache for port bf1d62fc-3a8d-4493-ae99-723fac577d26. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.806 2 DEBUG nova.network.neutron [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updating instance_info_cache with network_info: [{"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:57 np0005466013 systemd[1]: Started libpod-conmon-076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e.scope.
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.822 2 DEBUG oslo_concurrency.lockutils [req-9d21ab72-6f0e-4186-9b95-aaee05915e7b req-a0565ec2-897d-45f3-bcaf-9e9f7719e329 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.824 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408317.8243008, 4958df02-e9fa-4cb2-9175-4313cd3fd658 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.824 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] VM Started (Lifecycle Event)#033[00m
Oct  2 08:31:57 np0005466013 podman[244065]: 2025-10-02 12:31:57.731253735 +0000 UTC m=+0.033330466 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.826 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.830 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:31:57 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.834 2 INFO nova.virt.libvirt.driver [-] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Instance spawned successfully.#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.834 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:31:57 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb3b9d86227baccfefdee04b6bd98d54b156dcf3151053924b0d7cc6d915815a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.849 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.854 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:57 np0005466013 podman[244065]: 2025-10-02 12:31:57.862301065 +0000 UTC m=+0.164377766 container init 076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.866 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.867 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.867 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.868 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.868 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.869 2 DEBUG nova.virt.libvirt.driver [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:57 np0005466013 podman[244065]: 2025-10-02 12:31:57.870245614 +0000 UTC m=+0.172322315 container start 076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.896 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.897 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408317.8263822, 4958df02-e9fa-4cb2-9175-4313cd3fd658 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.897 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:31:57 np0005466013 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244079]: [NOTICE]   (244084) : New worker (244086) forked
Oct  2 08:31:57 np0005466013 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244079]: [NOTICE]   (244084) : Loading success.
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.931 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.935 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408317.829127, 4958df02-e9fa-4cb2-9175-4313cd3fd658 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.935 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.955 2 INFO nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Took 6.76 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.956 2 DEBUG nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.959 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:57 np0005466013 nova_compute[192144]: 2025-10-02 12:31:57.968 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:58 np0005466013 nova_compute[192144]: 2025-10-02 12:31:58.002 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:58 np0005466013 nova_compute[192144]: 2025-10-02 12:31:58.057 2 INFO nova.compute.manager [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Took 7.42 seconds to build instance.#033[00m
Oct  2 08:31:58 np0005466013 nova_compute[192144]: 2025-10-02 12:31:58.083 2 DEBUG oslo_concurrency.lockutils [None req-e0957d34-1654-49ea-bd66-8413aefe15d2 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:59 np0005466013 nova_compute[192144]: 2025-10-02 12:31:59.454 2 DEBUG nova.compute.manager [req-4dfba269-d384-4d1e-b069-1f3371891a95 req-20f86c8a-4890-41be-b7a4-01372718a151 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received event network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:59 np0005466013 nova_compute[192144]: 2025-10-02 12:31:59.455 2 DEBUG oslo_concurrency.lockutils [req-4dfba269-d384-4d1e-b069-1f3371891a95 req-20f86c8a-4890-41be-b7a4-01372718a151 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:59 np0005466013 nova_compute[192144]: 2025-10-02 12:31:59.456 2 DEBUG oslo_concurrency.lockutils [req-4dfba269-d384-4d1e-b069-1f3371891a95 req-20f86c8a-4890-41be-b7a4-01372718a151 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:59 np0005466013 nova_compute[192144]: 2025-10-02 12:31:59.456 2 DEBUG oslo_concurrency.lockutils [req-4dfba269-d384-4d1e-b069-1f3371891a95 req-20f86c8a-4890-41be-b7a4-01372718a151 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:59 np0005466013 nova_compute[192144]: 2025-10-02 12:31:59.457 2 DEBUG nova.compute.manager [req-4dfba269-d384-4d1e-b069-1f3371891a95 req-20f86c8a-4890-41be-b7a4-01372718a151 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] No waiting events found dispatching network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:59 np0005466013 nova_compute[192144]: 2025-10-02 12:31:59.457 2 WARNING nova.compute.manager [req-4dfba269-d384-4d1e-b069-1f3371891a95 req-20f86c8a-4890-41be-b7a4-01372718a151 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received unexpected event network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:32:00 np0005466013 nova_compute[192144]: 2025-10-02 12:32:00.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466013 nova_compute[192144]: 2025-10-02 12:32:01.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:02.315 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:02.316 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:02.317 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:05 np0005466013 nova_compute[192144]: 2025-10-02 12:32:05.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:06 np0005466013 nova_compute[192144]: 2025-10-02 12:32:06.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:09.413 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:09 np0005466013 nova_compute[192144]: 2025-10-02 12:32:09.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:09 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:09.416 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:32:10 np0005466013 nova_compute[192144]: 2025-10-02 12:32:10.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:10 np0005466013 podman[244109]: 2025-10-02 12:32:10.719943367 +0000 UTC m=+0.084770850 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:32:10 np0005466013 podman[244110]: 2025-10-02 12:32:10.746887154 +0000 UTC m=+0.104282133 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:32:10 np0005466013 podman[244111]: 2025-10-02 12:32:10.76868827 +0000 UTC m=+0.127168333 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller)
Oct  2 08:32:11 np0005466013 nova_compute[192144]: 2025-10-02 12:32:11.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:11.420 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:11Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4a:4b:3d 10.100.0.9
Oct  2 08:32:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:11Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:4b:3d 10.100.0.9
Oct  2 08:32:15 np0005466013 nova_compute[192144]: 2025-10-02 12:32:15.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:16 np0005466013 nova_compute[192144]: 2025-10-02 12:32:16.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:20 np0005466013 nova_compute[192144]: 2025-10-02 12:32:20.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:20 np0005466013 podman[244171]: 2025-10-02 12:32:20.702298205 +0000 UTC m=+0.078814082 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:32:20 np0005466013 podman[244172]: 2025-10-02 12:32:20.723412839 +0000 UTC m=+0.092230863 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm)
Oct  2 08:32:20 np0005466013 podman[244173]: 2025-10-02 12:32:20.7580901 +0000 UTC m=+0.117993105 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 08:32:21 np0005466013 nova_compute[192144]: 2025-10-02 12:32:21.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:25 np0005466013 nova_compute[192144]: 2025-10-02 12:32:25.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:26.250 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:76:2a 2001:db8:0:1:f816:3eff:fed0:762a 2001:db8::f816:3eff:fed0:762a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fed0:762a/64 2001:db8::f816:3eff:fed0:762a/64', 'neutron:device_id': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512667a6-6958-4dd6-8891-fcda7d607ab5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=763e1f51-8560-461a-a2f3-3c284c8e5a17) old=Port_Binding(mac=['fa:16:3e:d0:76:2a 2001:db8::f816:3eff:fed0:762a'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed0:762a/64', 'neutron:device_id': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:26.251 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 763e1f51-8560-461a-a2f3-3c284c8e5a17 in datapath f55e0845-fc62-481d-a70d-8546faf2b8fb updated#033[00m
Oct  2 08:32:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:26.252 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f55e0845-fc62-481d-a70d-8546faf2b8fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:32:26 np0005466013 nova_compute[192144]: 2025-10-02 12:32:26.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:26.278 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6e6d97c0-e885-41d5-81cc-b57300ae8516]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:26 np0005466013 podman[244230]: 2025-10-02 12:32:26.732592319 +0000 UTC m=+0.095238848 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:32:26 np0005466013 podman[244231]: 2025-10-02 12:32:26.74850194 +0000 UTC m=+0.107272068 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:32:30 np0005466013 nova_compute[192144]: 2025-10-02 12:32:30.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466013 nova_compute[192144]: 2025-10-02 12:32:31.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466013 nova_compute[192144]: 2025-10-02 12:32:31.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.021 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.021 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.028 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.042 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.164 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.249 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.249 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.257 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.257 2 INFO nova.compute.claims [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.259 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.260 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.348 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.461 2 DEBUG nova.compute.provider_tree [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.477 2 DEBUG nova.scheduler.client.report [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.508 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.509 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.553 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.555 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5530MB free_disk=73.17171096801758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.555 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.556 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.653 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.653 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.689 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 4958df02-e9fa-4cb2-9175-4313cd3fd658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.690 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 1eda1f2a-e061-4d62-b09d-49ac1dc55ace actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.690 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.691 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.741 2 INFO nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.756 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.792 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.886 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.894 2 DEBUG nova.policy [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.977 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:32:32 np0005466013 nova_compute[192144]: 2025-10-02 12:32:32.978 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.372 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.374 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.374 2 INFO nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Creating image(s)#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.374 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.375 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.375 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.388 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.469 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.470 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.471 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.483 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.571 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.572 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.611 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.612 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.613 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.677 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.678 2 DEBUG nova.virt.disk.api [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.679 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.735 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.736 2 DEBUG nova.virt.disk.api [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.736 2 DEBUG nova.objects.instance [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 1eda1f2a-e061-4d62-b09d-49ac1dc55ace obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.923 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.924 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Ensure instance console log exists: /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.924 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.925 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:33 np0005466013 nova_compute[192144]: 2025-10-02 12:32:33.925 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:35 np0005466013 nova_compute[192144]: 2025-10-02 12:32:35.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:35 np0005466013 nova_compute[192144]: 2025-10-02 12:32:35.524 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Successfully created port: 087a3a60-6c14-460a-99cf-049201b3c5b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:32:35 np0005466013 nova_compute[192144]: 2025-10-02 12:32:35.978 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:35 np0005466013 nova_compute[192144]: 2025-10-02 12:32:35.979 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:32:36 np0005466013 nova_compute[192144]: 2025-10-02 12:32:36.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:37 np0005466013 nova_compute[192144]: 2025-10-02 12:32:37.750 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Successfully created port: 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:32:38 np0005466013 nova_compute[192144]: 2025-10-02 12:32:38.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.089 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Successfully updated port: 087a3a60-6c14-460a-99cf-049201b3c5b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.483 2 DEBUG nova.compute.manager [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-changed-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.483 2 DEBUG nova.compute.manager [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing instance network info cache due to event network-changed-087a3a60-6c14-460a-99cf-049201b3c5b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.483 2 DEBUG oslo_concurrency.lockutils [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.483 2 DEBUG oslo_concurrency.lockutils [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.484 2 DEBUG nova.network.neutron [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing network info cache for port 087a3a60-6c14-460a-99cf-049201b3c5b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.888 2 DEBUG nova.network.neutron [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:32:39 np0005466013 nova_compute[192144]: 2025-10-02 12:32:39.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:40 np0005466013 nova_compute[192144]: 2025-10-02 12:32:40.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:40 np0005466013 nova_compute[192144]: 2025-10-02 12:32:40.173 2 DEBUG nova.network.neutron [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:40 np0005466013 nova_compute[192144]: 2025-10-02 12:32:40.339 2 DEBUG oslo_concurrency.lockutils [req-344e523a-2e9b-4f18-a153-baf3be140759 req-3b57f82e-7772-4db6-94e5-722d451c12de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:40 np0005466013 nova_compute[192144]: 2025-10-02 12:32:40.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.394 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Successfully updated port: 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.518 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.518 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.518 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.634 2 DEBUG nova.compute.manager [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-changed-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.635 2 DEBUG nova.compute.manager [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing instance network info cache due to event network-changed-4d4f5d39-ff10-4ea6-8b7a-df918302bf68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.635 2 DEBUG oslo_concurrency.lockutils [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:41 np0005466013 podman[244300]: 2025-10-02 12:32:41.692754479 +0000 UTC m=+0.059009818 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:32:41 np0005466013 podman[244299]: 2025-10-02 12:32:41.717039573 +0000 UTC m=+0.078909974 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:32:41 np0005466013 podman[244301]: 2025-10-02 12:32:41.767476231 +0000 UTC m=+0.121300989 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.910 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:32:41 np0005466013 nova_compute[192144]: 2025-10-02 12:32:41.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:43 np0005466013 nova_compute[192144]: 2025-10-02 12:32:43.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.022 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.023 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.024 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.047 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.234 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.235 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.235 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.236 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4958df02-e9fa-4cb2-9175-4313cd3fd658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.957 2 DEBUG nova.network.neutron [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updating instance_info_cache with network_info: [{"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.981 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.982 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance network_info: |[{"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.983 2 DEBUG oslo_concurrency.lockutils [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.983 2 DEBUG nova.network.neutron [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing network info cache for port 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.987 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Start _get_guest_xml network_info=[{"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:32:44 np0005466013 nova_compute[192144]: 2025-10-02 12:32:44.992 2 WARNING nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.003 2 DEBUG nova.virt.libvirt.host [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.004 2 DEBUG nova.virt.libvirt.host [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.008 2 DEBUG nova.virt.libvirt.host [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.009 2 DEBUG nova.virt.libvirt.host [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.010 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.010 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.011 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.011 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.011 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.012 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.012 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.012 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.013 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.013 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.013 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.013 2 DEBUG nova.virt.hardware [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.017 2 DEBUG nova.virt.libvirt.vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-618937776',display_name='tempest-TestGettingAddress-server-618937776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-618937776',id=145,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-1zbfq32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:32Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=1eda1f2a-e061-4d62-b09d-49ac1dc55ace,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.018 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.019 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:a3:73,bridge_name='br-int',has_traffic_filtering=True,id=087a3a60-6c14-460a-99cf-049201b3c5b7,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087a3a60-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.020 2 DEBUG nova.virt.libvirt.vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-618937776',display_name='tempest-TestGettingAddress-server-618937776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-618937776',id=145,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-1zbfq32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:32Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=1eda1f2a-e061-4d62-b09d-49ac1dc55ace,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.020 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.021 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:3f:d2,bridge_name='br-int',has_traffic_filtering=True,id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d4f5d39-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.022 2 DEBUG nova.objects.instance [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1eda1f2a-e061-4d62-b09d-49ac1dc55ace obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.044 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <uuid>1eda1f2a-e061-4d62-b09d-49ac1dc55ace</uuid>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <name>instance-00000091</name>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestGettingAddress-server-618937776</nova:name>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:32:44</nova:creationTime>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:port uuid="087a3a60-6c14-460a-99cf-049201b3c5b7">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        <nova:port uuid="4d4f5d39-ff10-4ea6-8b7a-df918302bf68">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe5b:3fd2" ipVersion="6"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe5b:3fd2" ipVersion="6"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <entry name="serial">1eda1f2a-e061-4d62-b09d-49ac1dc55ace</entry>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <entry name="uuid">1eda1f2a-e061-4d62-b09d-49ac1dc55ace</entry>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.config"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:bf:a3:73"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <target dev="tap087a3a60-6c"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:5b:3f:d2"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <target dev="tap4d4f5d39-ff"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/console.log" append="off"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:32:45 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:32:45 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:32:45 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:32:45 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.045 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Preparing to wait for external event network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.046 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.046 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.046 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.047 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Preparing to wait for external event network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.047 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.048 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.048 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.049 2 DEBUG nova.virt.libvirt.vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-618937776',display_name='tempest-TestGettingAddress-server-618937776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-618937776',id=145,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-1zbfq32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:32Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=1eda1f2a-e061-4d62-b09d-49ac1dc55ace,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.050 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.051 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:a3:73,bridge_name='br-int',has_traffic_filtering=True,id=087a3a60-6c14-460a-99cf-049201b3c5b7,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087a3a60-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.051 2 DEBUG os_vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:a3:73,bridge_name='br-int',has_traffic_filtering=True,id=087a3a60-6c14-460a-99cf-049201b3c5b7,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087a3a60-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.053 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.054 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.058 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap087a3a60-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.059 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap087a3a60-6c, col_values=(('external_ids', {'iface-id': '087a3a60-6c14-460a-99cf-049201b3c5b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:a3:73', 'vm-uuid': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 NetworkManager[51205]: <info>  [1759408365.0624] manager: (tap087a3a60-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.077 2 INFO os_vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:a3:73,bridge_name='br-int',has_traffic_filtering=True,id=087a3a60-6c14-460a-99cf-049201b3c5b7,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087a3a60-6c')#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.079 2 DEBUG nova.virt.libvirt.vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-618937776',display_name='tempest-TestGettingAddress-server-618937776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-618937776',id=145,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-1zbfq32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:32Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=1eda1f2a-e061-4d62-b09d-49ac1dc55ace,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.079 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.080 2 DEBUG nova.network.os_vif_util [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:3f:d2,bridge_name='br-int',has_traffic_filtering=True,id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d4f5d39-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.080 2 DEBUG os_vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:3f:d2,bridge_name='br-int',has_traffic_filtering=True,id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d4f5d39-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.081 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d4f5d39-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d4f5d39-ff, col_values=(('external_ids', {'iface-id': '4d4f5d39-ff10-4ea6-8b7a-df918302bf68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:3f:d2', 'vm-uuid': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 NetworkManager[51205]: <info>  [1759408365.0884] manager: (tap4d4f5d39-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.099 2 INFO os_vif [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:3f:d2,bridge_name='br-int',has_traffic_filtering=True,id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d4f5d39-ff')#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.248 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.249 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.249 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:bf:a3:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.250 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:5b:3f:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.251 2 INFO nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Using config drive#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.645 2 INFO nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Creating config drive at /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.config#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.655 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpslx7x3js execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.691 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updating instance_info_cache with network_info: [{"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.733 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.734 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.735 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.804 2 DEBUG oslo_concurrency.processutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpslx7x3js" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:45 np0005466013 kernel: tap087a3a60-6c: entered promiscuous mode
Oct  2 08:32:45 np0005466013 NetworkManager[51205]: <info>  [1759408365.9011] manager: (tap087a3a60-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Oct  2 08:32:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:45Z|00619|binding|INFO|Claiming lport 087a3a60-6c14-460a-99cf-049201b3c5b7 for this chassis.
Oct  2 08:32:45 np0005466013 nova_compute[192144]: 2025-10-02 12:32:45.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:45Z|00620|binding|INFO|087a3a60-6c14-460a-99cf-049201b3c5b7: Claiming fa:16:3e:bf:a3:73 10.100.0.12
Oct  2 08:32:45 np0005466013 NetworkManager[51205]: <info>  [1759408365.9276] manager: (tap4d4f5d39-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/278)
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.929 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:a3:73 10.100.0.12'], port_security=['fa:16:3e:bf:a3:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbb75d33-0be1-4472-abdd-63f2f4f59602, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=087a3a60-6c14-460a-99cf-049201b3c5b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.931 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 087a3a60-6c14-460a-99cf-049201b3c5b7 in datapath 48ae5e44-4c0f-44dd-b2b0-7bd3123da141 bound to our chassis#033[00m
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.936 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48ae5e44-4c0f-44dd-b2b0-7bd3123da141#033[00m
Oct  2 08:32:45 np0005466013 systemd-udevd[244393]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:45 np0005466013 systemd-udevd[244394]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.956 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e55aa4-2825-45b8-b801-4f67262a4e53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.958 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48ae5e44-41 in ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.961 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48ae5e44-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.962 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8d09e1-da59-40f2-a902-5554c616cb8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.964 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f04217-e45c-496c-84cc-29fb8e063c91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:45 np0005466013 NetworkManager[51205]: <info>  [1759408365.9657] device (tap087a3a60-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:45 np0005466013 NetworkManager[51205]: <info>  [1759408365.9685] device (tap087a3a60-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:45.993 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[d58715de-4041-4e19-b561-1c82cf611b8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 systemd-machined[152202]: New machine qemu-70-instance-00000091.
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 kernel: tap4d4f5d39-ff: entered promiscuous mode
Oct  2 08:32:46 np0005466013 NetworkManager[51205]: <info>  [1759408366.0227] device (tap4d4f5d39-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:46 np0005466013 NetworkManager[51205]: <info>  [1759408366.0241] device (tap4d4f5d39-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.023 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e17ba42-3c2f-464e-be3b-266bc51c415e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 systemd[1]: Started Virtual Machine qemu-70-instance-00000091.
Oct  2 08:32:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:46Z|00621|binding|INFO|Claiming lport 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 for this chassis.
Oct  2 08:32:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:46Z|00622|binding|INFO|4d4f5d39-ff10-4ea6-8b7a-df918302bf68: Claiming fa:16:3e:5b:3f:d2 2001:db8:0:1:f816:3eff:fe5b:3fd2 2001:db8::f816:3eff:fe5b:3fd2
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:46Z|00623|binding|INFO|Setting lport 087a3a60-6c14-460a-99cf-049201b3c5b7 ovn-installed in OVS
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:46Z|00624|binding|INFO|Setting lport 087a3a60-6c14-460a-99cf-049201b3c5b7 up in Southbound
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.035 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:3f:d2 2001:db8:0:1:f816:3eff:fe5b:3fd2 2001:db8::f816:3eff:fe5b:3fd2'], port_security=['fa:16:3e:5b:3f:d2 2001:db8:0:1:f816:3eff:fe5b:3fd2 2001:db8::f816:3eff:fe5b:3fd2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5b:3fd2/64 2001:db8::f816:3eff:fe5b:3fd2/64', 'neutron:device_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512667a6-6958-4dd6-8891-fcda7d607ab5, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4d4f5d39-ff10-4ea6-8b7a-df918302bf68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:46Z|00625|binding|INFO|Setting lport 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 ovn-installed in OVS
Oct  2 08:32:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:46Z|00626|binding|INFO|Setting lport 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 up in Southbound
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.072 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[3f2f8c4a-de31-4cb3-a92e-13eb2c904fa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.080 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6ad63f-f912-48ed-9098-b221c5e1328e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 NetworkManager[51205]: <info>  [1759408366.0812] manager: (tap48ae5e44-40): new Veth device (/org/freedesktop/NetworkManager/Devices/279)
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.128 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[824ac59e-b808-4028-aa0f-9b4117139ec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.140 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7aaa94b1-9cc0-4160-8338-0ff09e9fdac1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 NetworkManager[51205]: <info>  [1759408366.1633] device (tap48ae5e44-40): carrier: link connected
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.169 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[770980da-011e-4c8a-a514-0522d0285953]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.192 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[17daa35c-7b8d-425c-b2fa-e74c5cfec57a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48ae5e44-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:62:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635778, 'reachable_time': 21144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244430, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.213 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8ea76c-8e96-468e-93a6-550c9e057bc6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:6233'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635778, 'tstamp': 635778}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244431, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.233 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb7f3fa-f703-4a7c-9efe-c25433820c71]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48ae5e44-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:62:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635778, 'reachable_time': 21144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244432, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.278 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[04803fde-2b0f-47d3-b1ce-8ebb1376676b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.359 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa4095f-1fe7-429e-a77d-0fb4469d2bd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.360 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48ae5e44-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.360 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.360 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48ae5e44-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:46 np0005466013 NetworkManager[51205]: <info>  [1759408366.3629] manager: (tap48ae5e44-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Oct  2 08:32:46 np0005466013 kernel: tap48ae5e44-40: entered promiscuous mode
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.367 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48ae5e44-40, col_values=(('external_ids', {'iface-id': 'f8346990-e84e-49ae-958d-dc83725093d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:46Z|00627|binding|INFO|Releasing lport f8346990-e84e-49ae-958d-dc83725093d9 from this chassis (sb_readonly=0)
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.395 2 DEBUG nova.compute.manager [req-344085c3-7a17-4b9a-b8f7-f39c73e687f6 req-34429cd7-040a-45f4-96ef-1a9ab7639a37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.396 2 DEBUG oslo_concurrency.lockutils [req-344085c3-7a17-4b9a-b8f7-f39c73e687f6 req-34429cd7-040a-45f4-96ef-1a9ab7639a37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.396 2 DEBUG oslo_concurrency.lockutils [req-344085c3-7a17-4b9a-b8f7-f39c73e687f6 req-34429cd7-040a-45f4-96ef-1a9ab7639a37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.396 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.396 2 DEBUG oslo_concurrency.lockutils [req-344085c3-7a17-4b9a-b8f7-f39c73e687f6 req-34429cd7-040a-45f4-96ef-1a9ab7639a37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.397 2 DEBUG nova.compute.manager [req-344085c3-7a17-4b9a-b8f7-f39c73e687f6 req-34429cd7-040a-45f4-96ef-1a9ab7639a37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Processing event network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.397 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdbfd3b-0161-482d-a38c-c9b39811f0a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.399 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-48ae5e44-4c0f-44dd-b2b0-7bd3123da141
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.pid.haproxy
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 48ae5e44-4c0f-44dd-b2b0-7bd3123da141
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:32:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:46.401 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'env', 'PROCESS_TAG=haproxy-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.449 2 DEBUG nova.compute.manager [req-d8f1eed3-df18-4231-8639-5cdcb3281799 req-6b96259b-b6ed-4571-a292-5884db5c1714 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.451 2 DEBUG oslo_concurrency.lockutils [req-d8f1eed3-df18-4231-8639-5cdcb3281799 req-6b96259b-b6ed-4571-a292-5884db5c1714 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.451 2 DEBUG oslo_concurrency.lockutils [req-d8f1eed3-df18-4231-8639-5cdcb3281799 req-6b96259b-b6ed-4571-a292-5884db5c1714 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.451 2 DEBUG oslo_concurrency.lockutils [req-d8f1eed3-df18-4231-8639-5cdcb3281799 req-6b96259b-b6ed-4571-a292-5884db5c1714 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.452 2 DEBUG nova.compute.manager [req-d8f1eed3-df18-4231-8639-5cdcb3281799 req-6b96259b-b6ed-4571-a292-5884db5c1714 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Processing event network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:32:46 np0005466013 podman[244472]: 2025-10-02 12:32:46.778297104 +0000 UTC m=+0.028285260 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.913 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.914 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408366.9123392, 1eda1f2a-e061-4d62-b09d-49ac1dc55ace => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.914 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] VM Started (Lifecycle Event)#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.916 2 DEBUG nova.network.neutron [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updated VIF entry in instance network info cache for port 4d4f5d39-ff10-4ea6-8b7a-df918302bf68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.917 2 DEBUG nova.network.neutron [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updating instance_info_cache with network_info: [{"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.919 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.925 2 INFO nova.virt.libvirt.driver [-] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance spawned successfully.#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.926 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.944 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.951 2 DEBUG oslo_concurrency.lockutils [req-43aee28a-5159-4893-84aa-d34708284002 req-1b35a552-1f56-48c9-99d9-fce9fff7daf4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.956 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.961 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.962 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.963 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.964 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.965 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.966 2 DEBUG nova.virt.libvirt.driver [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.980 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.980 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408366.9126816, 1eda1f2a-e061-4d62-b09d-49ac1dc55ace => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.981 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:32:46 np0005466013 nova_compute[192144]: 2025-10-02 12:32:46.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:47 np0005466013 podman[244472]: 2025-10-02 12:32:47.003191943 +0000 UTC m=+0.253180099 container create d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.009 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.020 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408366.9169004, 1eda1f2a-e061-4d62-b09d-49ac1dc55ace => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.020 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.052 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.056 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.076 2 INFO nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Took 13.70 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.077 2 DEBUG nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.080 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:47 np0005466013 systemd[1]: Started libpod-conmon-d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b.scope.
Oct  2 08:32:47 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:32:47 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d624a9ebb372b44d75938e945b4d5f4fac3be329befebbae69cf96de91e7dca5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.189 2 INFO nova.compute.manager [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Took 15.04 seconds to build instance.#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.217 2 DEBUG oslo_concurrency.lockutils [None req-22149438-4f9d-4434-8d6c-b83f5d2b15f2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:47 np0005466013 podman[244472]: 2025-10-02 12:32:47.298591848 +0000 UTC m=+0.548580084 container init d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:32:47 np0005466013 podman[244472]: 2025-10-02 12:32:47.306152256 +0000 UTC m=+0.556140442 container start d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:32:47 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [NOTICE]   (244491) : New worker (244493) forked
Oct  2 08:32:47 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [NOTICE]   (244491) : Loading success.
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.443 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 in datapath f55e0845-fc62-481d-a70d-8546faf2b8fb unbound from our chassis#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.447 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f55e0845-fc62-481d-a70d-8546faf2b8fb#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.460 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd600d5-3543-4874-a37f-cd914af25ae1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.461 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf55e0845-f1 in ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.465 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf55e0845-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.465 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[01575ee3-f63c-4f5a-b0a0-746b31d17297]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.466 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8c408753-dded-4c8e-ace7-d2cf0a2e4d9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.479 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[375a2ca3-fe24-46c5-a1c6-02d94bb5b673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.503 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f976d360-e0e8-434b-ae4a-87aeb7804fb7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.530 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9aaaee-e36d-4ccc-9492-87f641d15b30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 NetworkManager[51205]: <info>  [1759408367.5383] manager: (tapf55e0845-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/281)
Oct  2 08:32:47 np0005466013 systemd-udevd[244411]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.536 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[430a7cf1-3971-449e-92af-6ee7d5fe46be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.583 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[270deeac-4301-42ca-beb7-08b346fc7fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.592 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[18fb1313-8dd9-4cb6-a79b-7a1a24ae1a0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 NetworkManager[51205]: <info>  [1759408367.6231] device (tapf55e0845-f0): carrier: link connected
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.628 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e206172a-81e9-4dd9-8400-efe33cf6e498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.653 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[38dc5c3a-43df-4c0b-b6de-adabc81f81b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf55e0845-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:76:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635924, 'reachable_time': 19503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244512, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.673 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0d040f-e6a1-4bf8-8932-03b74e2bde84]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:762a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635924, 'tstamp': 635924}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244513, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.691 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[57294dd7-fb53-499d-8ef7-7814ca4d0d8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf55e0845-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:76:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635924, 'reachable_time': 19503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244514, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.733 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8279cc25-f0cf-4207-9c9e-711734d3d9d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.771 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d6617f27-1b36-4e61-a07c-dffd11e72a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.773 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf55e0845-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.773 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.774 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf55e0845-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005466013 NetworkManager[51205]: <info>  [1759408367.7765] manager: (tapf55e0845-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Oct  2 08:32:47 np0005466013 kernel: tapf55e0845-f0: entered promiscuous mode
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.788 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf55e0845-f0, col_values=(('external_ids', {'iface-id': '763e1f51-8560-461a-a2f3-3c284c8e5a17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:47Z|00628|binding|INFO|Releasing lport 763e1f51-8560-461a-a2f3-3c284c8e5a17 from this chassis (sb_readonly=0)
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.802 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f55e0845-fc62-481d-a70d-8546faf2b8fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f55e0845-fc62-481d-a70d-8546faf2b8fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:32:47 np0005466013 nova_compute[192144]: 2025-10-02 12:32:47.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.803 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[83b905ff-8d0e-4ab7-af78-3cfb98a161e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.804 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-f55e0845-fc62-481d-a70d-8546faf2b8fb
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/f55e0845-fc62-481d-a70d-8546faf2b8fb.pid.haproxy
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID f55e0845-fc62-481d-a70d-8546faf2b8fb
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:32:47 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:32:47.805 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'env', 'PROCESS_TAG=haproxy-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f55e0845-fc62-481d-a70d-8546faf2b8fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:32:48 np0005466013 podman[244546]: 2025-10-02 12:32:48.319355792 +0000 UTC m=+0.103480887 container create 41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:32:48 np0005466013 podman[244546]: 2025-10-02 12:32:48.246229061 +0000 UTC m=+0.030354176 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:32:48 np0005466013 systemd[1]: Started libpod-conmon-41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2.scope.
Oct  2 08:32:48 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:32:48 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3225e0d4ac0687970187dfe0aaaf0ae262783ef6b6e2b42017da435eb93ec836/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:32:48 np0005466013 podman[244546]: 2025-10-02 12:32:48.428364433 +0000 UTC m=+0.212489578 container init 41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:32:48 np0005466013 podman[244546]: 2025-10-02 12:32:48.433969819 +0000 UTC m=+0.218094924 container start 41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:32:48 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [NOTICE]   (244565) : New worker (244567) forked
Oct  2 08:32:48 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [NOTICE]   (244565) : Loading success.
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.535 2 DEBUG nova.compute.manager [req-4f8b5628-ed7c-4670-921b-0307ae2c0302 req-dd58d7c4-411c-4ab4-90bb-b3766ce81b4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.535 2 DEBUG oslo_concurrency.lockutils [req-4f8b5628-ed7c-4670-921b-0307ae2c0302 req-dd58d7c4-411c-4ab4-90bb-b3766ce81b4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.535 2 DEBUG oslo_concurrency.lockutils [req-4f8b5628-ed7c-4670-921b-0307ae2c0302 req-dd58d7c4-411c-4ab4-90bb-b3766ce81b4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.536 2 DEBUG oslo_concurrency.lockutils [req-4f8b5628-ed7c-4670-921b-0307ae2c0302 req-dd58d7c4-411c-4ab4-90bb-b3766ce81b4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.536 2 DEBUG nova.compute.manager [req-4f8b5628-ed7c-4670-921b-0307ae2c0302 req-dd58d7c4-411c-4ab4-90bb-b3766ce81b4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] No waiting events found dispatching network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.536 2 WARNING nova.compute.manager [req-4f8b5628-ed7c-4670-921b-0307ae2c0302 req-dd58d7c4-411c-4ab4-90bb-b3766ce81b4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received unexpected event network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.814 2 DEBUG nova.compute.manager [req-76434c2b-71ad-4eff-b3cb-0cfbca5d19e0 req-ea45e442-59a3-4f6c-8797-9cee00143b99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.814 2 DEBUG oslo_concurrency.lockutils [req-76434c2b-71ad-4eff-b3cb-0cfbca5d19e0 req-ea45e442-59a3-4f6c-8797-9cee00143b99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.814 2 DEBUG oslo_concurrency.lockutils [req-76434c2b-71ad-4eff-b3cb-0cfbca5d19e0 req-ea45e442-59a3-4f6c-8797-9cee00143b99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.815 2 DEBUG oslo_concurrency.lockutils [req-76434c2b-71ad-4eff-b3cb-0cfbca5d19e0 req-ea45e442-59a3-4f6c-8797-9cee00143b99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.815 2 DEBUG nova.compute.manager [req-76434c2b-71ad-4eff-b3cb-0cfbca5d19e0 req-ea45e442-59a3-4f6c-8797-9cee00143b99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] No waiting events found dispatching network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:48 np0005466013 nova_compute[192144]: 2025-10-02 12:32:48.815 2 WARNING nova.compute.manager [req-76434c2b-71ad-4eff-b3cb-0cfbca5d19e0 req-ea45e442-59a3-4f6c-8797-9cee00143b99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received unexpected event network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:32:50 np0005466013 nova_compute[192144]: 2025-10-02 12:32:50.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:50 np0005466013 nova_compute[192144]: 2025-10-02 12:32:50.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:50 np0005466013 nova_compute[192144]: 2025-10-02 12:32:50.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:50 np0005466013 NetworkManager[51205]: <info>  [1759408370.2535] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Oct  2 08:32:50 np0005466013 NetworkManager[51205]: <info>  [1759408370.2543] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Oct  2 08:32:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:50Z|00629|binding|INFO|Releasing lport 763e1f51-8560-461a-a2f3-3c284c8e5a17 from this chassis (sb_readonly=0)
Oct  2 08:32:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:50Z|00630|binding|INFO|Releasing lport f8346990-e84e-49ae-958d-dc83725093d9 from this chassis (sb_readonly=0)
Oct  2 08:32:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:50Z|00631|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:32:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:50Z|00632|binding|INFO|Releasing lport 763e1f51-8560-461a-a2f3-3c284c8e5a17 from this chassis (sb_readonly=0)
Oct  2 08:32:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:50Z|00633|binding|INFO|Releasing lport f8346990-e84e-49ae-958d-dc83725093d9 from this chassis (sb_readonly=0)
Oct  2 08:32:50 np0005466013 ovn_controller[94366]: 2025-10-02T12:32:50Z|00634|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:32:50 np0005466013 nova_compute[192144]: 2025-10-02 12:32:50.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:51 np0005466013 podman[244578]: 2025-10-02 12:32:51.727676624 +0000 UTC m=+0.097135218 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, vcs-type=git, 
managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:32:51 np0005466013 podman[244579]: 2025-10-02 12:32:51.734523839 +0000 UTC m=+0.098181291 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:32:51 np0005466013 podman[244577]: 2025-10-02 12:32:51.761671954 +0000 UTC m=+0.126966157 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 08:32:53 np0005466013 nova_compute[192144]: 2025-10-02 12:32:53.035 2 DEBUG nova.compute.manager [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-changed-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:53 np0005466013 nova_compute[192144]: 2025-10-02 12:32:53.037 2 DEBUG nova.compute.manager [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing instance network info cache due to event network-changed-087a3a60-6c14-460a-99cf-049201b3c5b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:32:53 np0005466013 nova_compute[192144]: 2025-10-02 12:32:53.037 2 DEBUG oslo_concurrency.lockutils [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:32:53 np0005466013 nova_compute[192144]: 2025-10-02 12:32:53.038 2 DEBUG oslo_concurrency.lockutils [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:32:53 np0005466013 nova_compute[192144]: 2025-10-02 12:32:53.038 2 DEBUG nova.network.neutron [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing network info cache for port 087a3a60-6c14-460a-99cf-049201b3c5b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:32:55 np0005466013 nova_compute[192144]: 2025-10-02 12:32:55.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:55 np0005466013 nova_compute[192144]: 2025-10-02 12:32:55.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:56 np0005466013 nova_compute[192144]: 2025-10-02 12:32:56.397 2 DEBUG nova.network.neutron [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updated VIF entry in instance network info cache for port 087a3a60-6c14-460a-99cf-049201b3c5b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:32:56 np0005466013 nova_compute[192144]: 2025-10-02 12:32:56.398 2 DEBUG nova.network.neutron [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updating instance_info_cache with network_info: [{"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:32:56 np0005466013 nova_compute[192144]: 2025-10-02 12:32:56.438 2 DEBUG oslo_concurrency.lockutils [req-012f029d-b9ab-4842-9ffa-20c4739fc428 req-dfd0093d-745c-4ae2-b5ae-a4febb8b6c0a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:32:57 np0005466013 podman[244639]: 2025-10-02 12:32:57.671779477 +0000 UTC m=+0.052223884 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:32:57 np0005466013 podman[244640]: 2025-10-02 12:32:57.679004894 +0000 UTC m=+0.056310613 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001)
Oct  2 08:33:00 np0005466013 nova_compute[192144]: 2025-10-02 12:33:00.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:00 np0005466013 nova_compute[192144]: 2025-10-02 12:33:00.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:02.316 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:33:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:02.317 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:33:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:02.319 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:33:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:03Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:a3:73 10.100.0.12
Oct  2 08:33:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:03Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:a3:73 10.100.0.12
Oct  2 08:33:05 np0005466013 nova_compute[192144]: 2025-10-02 12:33:05.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:05 np0005466013 nova_compute[192144]: 2025-10-02 12:33:05.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:10 np0005466013 nova_compute[192144]: 2025-10-02 12:33:10.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:10 np0005466013 nova_compute[192144]: 2025-10-02 12:33:10.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:11 np0005466013 nova_compute[192144]: 2025-10-02 12:33:11.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:11.092 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:33:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:11.096 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:33:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:11Z|00635|binding|INFO|Releasing lport 763e1f51-8560-461a-a2f3-3c284c8e5a17 from this chassis (sb_readonly=0)
Oct  2 08:33:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:11Z|00636|binding|INFO|Releasing lport f8346990-e84e-49ae-958d-dc83725093d9 from this chassis (sb_readonly=0)
Oct  2 08:33:11 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:11Z|00637|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:33:11 np0005466013 nova_compute[192144]: 2025-10-02 12:33:11.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:12.099 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:33:12 np0005466013 podman[244704]: 2025-10-02 12:33:12.696889953 +0000 UTC m=+0.062907391 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:33:12 np0005466013 podman[244705]: 2025-10-02 12:33:12.71811468 +0000 UTC m=+0.084229381 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:33:12 np0005466013 podman[244706]: 2025-10-02 12:33:12.726126033 +0000 UTC m=+0.092482872 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:33:15 np0005466013 nova_compute[192144]: 2025-10-02 12:33:15.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.358 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'name': 'tempest-TestGettingAddress-server-618937776', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000091', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'hostId': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.361 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'name': 'tempest-₡-1854753736', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a4a7099974504a798e1607c8e6a1f570', 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'hostId': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.361 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.367 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1eda1f2a-e061-4d62-b09d-49ac1dc55ace / tap087a3a60-6c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.367 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1eda1f2a-e061-4d62-b09d-49ac1dc55ace / tap4d4f5d39-ff inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.368 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.bytes volume: 1948 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.368 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.bytes volume: 1072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.372 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4958df02-e9fa-4cb2-9175-4313cd3fd658 / tapbf1d62fc-3a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.373 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.incoming.bytes volume: 1652 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa272cc7-669d-4ffa-b098-951188fa186d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1948, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.362024', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f84f0a22-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'f456741cab642a414373300f7dee0d61357ceb598fae6694e6cbc60f7f0d3ffc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1072, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.362024', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f84f17d8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '455e972dfbec806cd536c20dce33488626dbe78ffe25de7bf549abf4696bedaf'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1652, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.362024', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 
'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f84fccf0-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': '3b11b30e650cc2ad1a7208d05c3e1e3c8525b6df6774615baefb94a9da65e990'}]}, 'timestamp': '2025-10-02 12:33:16.373830', '_unique_id': '29f6d7db5bda42ac8c69c85bd4cf58fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.375 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.378 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.406 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.read.latency volume: 1917650459 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.407 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.read.latency volume: 126201006 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.434 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.read.latency volume: 1189585779 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.435 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.read.latency volume: 60265733 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf9167be-fd0d-4586-b3a0-86490aeea28c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1917650459, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.378382', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f854e1d6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '5739c220f43932a472261f49fc3ba656317205343600350550a73c94bca3b554'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 126201006, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.378382', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f854fc2a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '7b89cb300370b31acffbd9739986af39626ae7ffd3b59ec8548fb313eb793935'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1189585779, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.378382', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8592c0a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': 'bee9295111fa07ae19dc2c5e27e81df8e7954178eafefe15ff7d3fd2d339b8c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60265733, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.378382', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f85940aa-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': '85334aca1b11e0932d5deb2da74550d6afa9d42a0004dc4515d9589685298048'}]}, 'timestamp': '2025-10-02 12:33:16.435699', '_unique_id': '806e3efebe4e4eb5b30b39f42a1031f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.438 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.439 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.439 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.440 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1963db77-785f-4893-a085-a7c2774273d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.439010', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f859d54c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'be38959af3ec3daa63042c74fc79df8a86aefd643d302b24ed9cdd27eba35bf4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.439010', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f859e744-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'ed0dc099c420672d90fa0d298f8143faef30f0ac2bf3513ceb9ca70ed8f0e936'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.439010', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f859fc52-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': '3221fb5f1e346078c140970413ba89fa0b4293ce43902686b5027c44fd723e91'}]}, 'timestamp': '2025-10-02 12:33:16.440498', '_unique_id': 'fd6514d6bdd14910927ef90327ecb21e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.441 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.442 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.462 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.463 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.479 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.480 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b751ffa-82f6-467b-b432-b8077c3b82b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.443102', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f85d7dbe-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.121374241, 'message_signature': '81e8c7d07a7e9109a041532158f9fa1743b250ebbaa3b6fa7921d02dd86095da'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.443102', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f85d9222-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.121374241, 'message_signature': '1139e44cc8c96444b5e8e277435594b01e5d54c4f8a0e45e1a33b098bf7e477f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.443102', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8600a34-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.142282939, 'message_signature': '6d46da8c9f907570f329e5f5aa4e933218c1db05a3720fa5e8b159e202441a34'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.443102', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8601b96-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.142282939, 'message_signature': '188fb82b066fb526e3b21d5dc5fa72141165fb5cb30802efcf71664b414c2d1a'}]}, 'timestamp': '2025-10-02 12:33:16.480637', '_unique_id': 'd552e715c005498d8ec597d0e67335d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.481 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.482 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.482 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.read.bytes volume: 30792192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.483 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.483 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.483 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a5d3454-9979-4ffd-a4d2-2da4dcaefde0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30792192, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.482948', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8608496-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '39f2c10d13aafccd53889104614fc2114469a17d6b15fb36919aac96ec7f4d36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.482948', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8608cd4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '87b52cd344dc8d5a4c712078c34b413a5f2140b89b453d0be8cabab74914ba25'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.482948', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 
'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8609fa8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': 'e98daf4cb31688756e1ef52b7f4e15dfc443b77c296d168158361708ac023f8a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.482948', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f860ab7e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': '2ca749a3b4884727d7901a3d69eefdec94681a836df377ddf6dc29c0a3a31fb0'}]}, 'timestamp': '2025-10-02 12:33:16.484273', '_unique_id': 'f5da0e6175e44bffbf4acff8f2a44695'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.485 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.486 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.508 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/memory.usage volume: 43.95703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.530 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/memory.usage volume: 42.61328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '128b1116-4de9-4e81-b9c4-870c5f6051e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.95703125, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'timestamp': '2025-10-02T12:33:16.486285', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f8648618-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.187043107, 'message_signature': '2f030b95c50df55d576d27a7823c83dce4049c4bb436493f92681b71f88db2e4'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.61328125, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 
'4958df02-e9fa-4cb2-9175-4313cd3fd658', 'timestamp': '2025-10-02T12:33:16.486285', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f867c814-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.208592146, 'message_signature': 'd4667392054647cacd4377aba763ceb0a2074af5facc56eed16a56b95948a6d8'}]}, 'timestamp': '2025-10-02 12:33:16.530921', '_unique_id': '332f8d86977a4775b4b28bb68b013634'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.533 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.534 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.535 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.535 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65512a68-aecd-4bac-859e-12aa8c17d32d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.534508', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86867ce-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '2742ac9026bfc5e87738934f4cd3bc64ad27b1f9a96e0d58a7ba34c8704b39d4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.534508', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f8687746-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '615cfad9ed31a18dc5d3c9ce83df22efdc471cb49a06e2c6ae08257d5256b4ce'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.534508', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 
'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f86886aa-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': '4986609c7928f367878704e36a1d773ed008fe022e45a917b9bb55282d23af6c'}]}, 'timestamp': '2025-10-02 12:33:16.535783', '_unique_id': '42568435b80c48ab9bb4b780b9ce6f99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.537 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.538 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.538 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.538 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.write.requests volume: 303 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.539 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '164c07bd-60e0-4fcf-88f0-cad663864956', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 334, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.538242', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f868f6e4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '803bf7c5db5fd919d01a442b6f4501759577f4b1894482f36c21630387eb293f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.538242', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86907ba-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '11d64338b858096d20d2e0f9606c8d36b170bfd7c191a00d7f56d39b293f7c28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 303, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.538242', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86911c4-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': '38754c4cf4e3a0c0aaebfde0631cdf5ecac782e8c9107a75052c382995bc2b15'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.538242', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f869257e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': 'f93e287ce644094fd767fc33497de7cc24a6c85c5d6c94eff9c99f75575535dc'}]}, 'timestamp': '2025-10-02 12:33:16.539774', '_unique_id': '03d21eec24294f2a810be5c7b6d43215'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.540 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.541 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.541 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/cpu volume: 12160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.542 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/cpu volume: 11860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d69ee34-1fb8-4130-887b-0a2e7d9b1eb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12160000000, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'timestamp': '2025-10-02T12:33:16.541894', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f86983ac-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.187043107, 'message_signature': '2f59fbf38ba9051c04a73af70507a0542a1fbde527601d248ea00ae91f2a51f0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11860000000, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 
'4958df02-e9fa-4cb2-9175-4313cd3fd658', 'timestamp': '2025-10-02T12:33:16.541894', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f8698f1e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.208592146, 'message_signature': 'a42a8e9054aa6d1f81756c1784f3345a9aefc755abb1f863b4c488097b51643c'}]}, 'timestamp': '2025-10-02 12:33:16.542752', '_unique_id': '538e30ec7d5540919efcf4b3b42c9c57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.543 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.544 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.write.latency volume: 120140349739 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.544 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.545 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.write.latency volume: 2242399187 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.545 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd51ae7f9-d29c-4a37-b6bb-48e548db96c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 120140349739, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.544313', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f869e900-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': 'b381aeaca996f450d5a18b10b097e62fdacaa779ddca3044b7593860953f45f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.544313', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f869f468-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': 'db4b569c06b35acba5d8935d0ce4419465f01c4f63e2f80ff294ddcde3df7a95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2242399187, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.544313', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86a0412-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': 'a915ce2234e3079e30b4ba5cdeaf896285a854e29f52047cc0a6acd8dc5ff1b0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.544313', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86a109c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': '938276f80d2352c12b96514478ad9c3712a2811222a41f464091b576992f0a24'}]}, 'timestamp': '2025-10-02 12:33:16.545786', '_unique_id': '0d58d98ec8d04e7ca731657cd1466554'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.546 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.547 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.548 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.548 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>]
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.548 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.548 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.548 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.549 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.549 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5237cc6-ad03-4bb0-be16-f46450b924ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.548590', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86a8900-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.121374241, 'message_signature': '3de04800d566781c93efe1cc7ccfd937813de93e5cfecb4292e298ff977c2014'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.548590', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86a93fa-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.121374241, 'message_signature': '84b4e72eab87e120a828e2956aac7e67f222e6115c02981e6eecc15117a120b9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.548590', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86a9ddc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.142282939, 'message_signature': 'c59079fbf4b0ff52c9e7864fdbefbe6eec4e418d6e6eb62fcafc21b684438809'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.548590', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86ab556-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.142282939, 'message_signature': '8145c9098a41487369524ed71c82e37b1b6fe07e638ef795c6fdf1e936faf8bc'}]}, 'timestamp': '2025-10-02 12:33:16.550000', '_unique_id': 'f0045b0ef3a74062a24c52fbd1cfced2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.551 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.552 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.552 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.552 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.552 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.553 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10ee747c-61c2-4592-9a3d-4ec6b05981e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.552222', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86b16ae-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.121374241, 'message_signature': '1e2dd59e71c089143f7f3477507fb0f6120cfefbc20a2cf7487d28e2ea5c5085'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.552222', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86b21da-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.121374241, 'message_signature': '3309378d70813d6d566b9715a270d5bdedcf00a53d010b4a94a6680afa5ef4d7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.552222', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86b3148-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.142282939, 'message_signature': 'f7db6dd34822d5aa080ad5c90564b0a906d711527e3bd3d7ef3628c671eb7d91'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.552222', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86b3f1c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.142282939, 'message_signature': '89cf1ebb1983cb3d0775db3d27f9e5ec9b4d392083def95dc049d42d93248212'}]}, 'timestamp': '2025-10-02 12:33:16.553641', '_unique_id': 'f931b9ec6c1f4213964d256e80600d50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.554 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.555 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.555 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.read.requests volume: 1094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.555 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.556 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.556 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc875317-cc4d-41ad-8e31-a9ea714bf433', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1094, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.555503', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86b9714-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '127c7e4e1bd962f9c8ce5553637203aecb27891556bc29715475a7f7132ea117'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.555503', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86ba682-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '6918565925c6fbd10b74057eb50c45526b6a65029e0a5a7d282a5c42a0b34b18'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.555503', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86bb546-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': '6d9916c740fb3a886d81b8e104e770ec25b6e4e41097b4383223d633a15ef32b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.555503', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86bc284-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': 'e0cbf32dcb58dfad95ec5a39722c722a34dabd33246dda6d2a5cd8622b041c31'}]}, 'timestamp': '2025-10-02 12:33:16.556963', '_unique_id': '10f2b8cecba7481cb4e5bd5850b8d259'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.557 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.558 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.558 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.559 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.559 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74eeaa0a-6111-448f-8522-becdf00945e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.558937', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86c1ebe-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '3271d9553893d3950f1cbc32336ff68be38764b2d5eb768dca7410719c5ce174'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.558937', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f86c2d82-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'd59b126adf2e65d8ed5def5e612503dc05962f3a90b6588175a8a0978a1989b9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.558937', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f86c3b7e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': 'd249f7f12ac86b1421bf24980b12b2e417a74a590d7bad90370b21855520e76f'}]}, 'timestamp': '2025-10-02 12:33:16.560040', '_unique_id': 'ca6fe3739cf04304aeed11815799c955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.560 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.562 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.562 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.562 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42c80ec6-43fb-4cbd-8307-6c5e6e48bed6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.562091', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86c9e02-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'cf7123471e9e39162f0a424fb7d7f7b2f27f0288a725df16c40232ae9468c808'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.562091', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f86ca9f6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '9bd997c497fa3c62e0e21639e26733b747a964cf31041737a31f1e873bd0dfc0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.562091', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 
'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f86cba22-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': '5c83ce8b02a5dea4fb65164a268409ad6de70ef927da8b372aac28f97dcdd89e'}]}, 'timestamp': '2025-10-02 12:33:16.563311', '_unique_id': '2b08f6da9c614dd693cca2aa5ef1a6e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.564 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.565 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.565 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>]
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.566 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.566 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>]
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.566 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.566 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.567 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.567 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.write.bytes volume: 73035776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.567 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '632701ca-21c2-4ea3-91cc-9dded7692bea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-vda', 'timestamp': '2025-10-02T12:33:16.566640', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86d4faa-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '07d61b4877237c885753c2488253487ec28ad5851e516120f227ca84da5fdd78'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace-sda', 'timestamp': '2025-10-02T12:33:16.566640', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'instance-00000091', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86d5bc6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.056653764, 'message_signature': '5d9556813de1e0b7e3c181c81580d6ae7a79f81074f08c17990bdf2bcd975e6c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73035776, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-vda', 'timestamp': '2025-10-02T12:33:16.566640', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f86d6972-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': '7a6fc19602606edd5ce25ecc4869df97e053308a753e4d84be2bfb6c8e015e15'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658-sda', 'timestamp': '2025-10-02T12:33:16.566640', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'instance-0000008c', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f86d7534-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.085866753, 'message_signature': '78a86ba56ac2b536366ee56319c603a40dd1ed7c034b7ae37b1c515a745494c6'}]}, 'timestamp': '2025-10-02 12:33:16.568037', '_unique_id': 'b4a75dba5f804ac9987e81c65e5c4cfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.569 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.570 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.570 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.570 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-618937776>, <NovaLikeServer: tempest-₡-1854753736>]
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.570 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.571 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.571 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9eba88cb-d634-482f-8099-5141848c4e43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.570710', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86de92e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '202f87804eeb4f31a172f0a7e33c02b8eb58d07aeef1a84c63852cd50eeb94c7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.570710', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f86dfac2-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '8ab7f745061d6360a7d2c838067247fcbb15d9a00419b8916eb8e82507570bc7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.570710', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f86e0634-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': 'c69f22d7f02bc11297d0c59a27f470df5c4d6c52d0733e179c6b9b1cfa81c642'}]}, 'timestamp': '2025-10-02 12:33:16.571726', '_unique_id': 'e187ecec69c849a2b31ca101261e9750'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.572 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.573 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.573 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.574 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.574 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ff9fdcb-af5f-40ba-9055-2cd58c7daa13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.573969', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86e6854-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '0051e35857232d0c2d224a2297ca71ac627433daf635a35f7eed592002782595'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.573969', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f86e742a-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '833628d26c38fb4fb17be15615d974ffd9ad878c1cbbe32d0b47086efe623b98'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.573969', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f86e8b18-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': '0831f7cbfa548fa8433cecc44ebb88dc8149d36b339001a8e554fed5be25b749'}]}, 'timestamp': '2025-10-02 12:33:16.575149', '_unique_id': '043f08f392d1417c8cdfde178aebfb26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.576 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.577 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.577 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.578 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c8193a7-a92c-4a0f-8296-e972511047cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.577399', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86ef396-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'eed404b9963d578634faebc2716a81efd25f9ea6bf4a55a3a2f8192ba2881da8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.577399', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f86efec2-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'c841a7e74e9d3765d386843249f8d5a4588d4c11a255ca271dae1939a596466c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.577399', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f86f0c6e-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': '36f4aa6328dcbd48a96b7079b8ffcd5246e6ba4bc9072bcb0d5b516a0a8a3ee6'}]}, 'timestamp': '2025-10-02 12:33:16.578557', '_unique_id': 'dc015d52bf9942fba4a99df586a8224a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.579 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.580 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.581 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.bytes volume: 2278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.581 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7808faf-a137-4c74-8ea1-2cd7e4cb74c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.580711', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86f762c-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': 'ae648c11663727632edfb8cce43b60dd00081fa05aa6a26206d73af1b2044a2e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2278, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.580711', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f86f8234-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '10e0b50b2cae2d1b26e0de130f9b8ebb204d11de084b942a29a993f1fb4738a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.580711', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 'instance_host': 
'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f86f91e8-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': 'd4028820184912af8f4af4544e16b708645abcf7cb1a633a6c271c6bc9e313e1'}]}, 'timestamp': '2025-10-02 12:33:16.581993', '_unique_id': '6002a8e9038b4abcb17dff64c686536d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.582 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.583 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.584 12 DEBUG ceilometer.compute.pollsters [-] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.584 12 DEBUG ceilometer.compute.pollsters [-] 4958df02-e9fa-4cb2-9175-4313cd3fd658/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bfc673a-99e8-4bee-b3c5-cda5991c4e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap087a3a60-6c', 'timestamp': '2025-10-02T12:33:16.583919', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap087a3a60-6c', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:a3:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap087a3a60-6c'}, 'message_id': 'f86fede6-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '3ebdece3334b39fdfce6ee6f76524e655cc8737b70064f027a825c93abe03ced'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000091-1eda1f2a-e061-4d62-b09d-49ac1dc55ace-tap4d4f5d39-ff', 'timestamp': '2025-10-02T12:33:16.583919', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-618937776', 'name': 'tap4d4f5d39-ff', 'instance_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:3f:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d4f5d39-ff'}, 'message_id': 'f86ffcdc-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.040235297, 'message_signature': '2d709cc6fb2cd580fae87a41f64cd43b5f2a0786c53654c863728ad2658e457c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_name': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_name': None, 'resource_id': 'instance-0000008c-4958df02-e9fa-4cb2-9175-4313cd3fd658-tapbf1d62fc-3a', 'timestamp': '2025-10-02T12:33:16.583919', 'resource_metadata': {'display_name': 'tempest-₡-1854753736', 'name': 'tapbf1d62fc-3a', 'instance_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'instance_type': 'm1.nano', 'host': 'e691ce900214cff61c226df9d77e545b4ce49570c884449f9fb6ad18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:4b:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbf1d62fc-3a'}, 'message_id': 'f8700d94-9f8b-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6388.047195096, 'message_signature': '39dcd2bce32f7bdaf595c5cba1f8a944a2c8610d998f51bb60283031249017e3'}]}, 'timestamp': '2025-10-02 12:33:16.585037', '_unique_id': '2bd7e2390a4348968bc0018b14ea428d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:33:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:33:16.585 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:33:20 np0005466013 nova_compute[192144]: 2025-10-02 12:33:20.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:20 np0005466013 nova_compute[192144]: 2025-10-02 12:33:20.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:20 np0005466013 nova_compute[192144]: 2025-10-02 12:33:20.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  2 08:33:20 np0005466013 nova_compute[192144]: 2025-10-02 12:33:20.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:33:20 np0005466013 nova_compute[192144]: 2025-10-02 12:33:20.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:33:20 np0005466013 nova_compute[192144]: 2025-10-02 12:33:20.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:22 np0005466013 nova_compute[192144]: 2025-10-02 12:33:22.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:22 np0005466013 podman[244769]: 2025-10-02 12:33:22.682570796 +0000 UTC m=+0.057124158 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Oct  2 08:33:22 np0005466013 podman[244768]: 2025-10-02 12:33:22.695657458 +0000 UTC m=+0.070532281 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:33:22 np0005466013 podman[244770]: 2025-10-02 12:33:22.704542318 +0000 UTC m=+0.070315004 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:33:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:22Z|00638|binding|INFO|Releasing lport 763e1f51-8560-461a-a2f3-3c284c8e5a17 from this chassis (sb_readonly=0)
Oct  2 08:33:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:22Z|00639|binding|INFO|Releasing lport f8346990-e84e-49ae-958d-dc83725093d9 from this chassis (sb_readonly=0)
Oct  2 08:33:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:22Z|00640|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:33:22 np0005466013 nova_compute[192144]: 2025-10-02 12:33:22.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:25 np0005466013 nova_compute[192144]: 2025-10-02 12:33:25.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:27 np0005466013 nova_compute[192144]: 2025-10-02 12:33:27.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466013 nova_compute[192144]: 2025-10-02 12:33:28.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466013 podman[244831]: 2025-10-02 12:33:28.693488402 +0000 UTC m=+0.063110738 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:33:28 np0005466013 podman[244832]: 2025-10-02 12:33:28.754013937 +0000 UTC m=+0.111376837 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=iscsid)
Oct  2 08:33:30 np0005466013 nova_compute[192144]: 2025-10-02 12:33:30.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:33 np0005466013 nova_compute[192144]: 2025-10-02 12:33:33.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.028 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.123 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.216 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.218 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.311 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.319 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.379 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.380 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.474 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.670 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.672 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5321MB free_disk=73.14299011230469GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.672 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.672 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.756 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 4958df02-e9fa-4cb2-9175-4313cd3fd658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.757 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 1eda1f2a-e061-4d62-b09d-49ac1dc55ace actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.757 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.757 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.811 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.827 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.848 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:33:34 np0005466013 nova_compute[192144]: 2025-10-02 12:33:34.848 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:35 np0005466013 nova_compute[192144]: 2025-10-02 12:33:35.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:35 np0005466013 irqbalance[782]: Cannot change IRQ 26 affinity: Operation not permitted
Oct  2 08:33:35 np0005466013 irqbalance[782]: IRQ 26 affinity is now unmanaged
Oct  2 08:33:35 np0005466013 nova_compute[192144]: 2025-10-02 12:33:35.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:35 np0005466013 nova_compute[192144]: 2025-10-02 12:33:35.849 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:35 np0005466013 nova_compute[192144]: 2025-10-02 12:33:35.849 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:33:38 np0005466013 nova_compute[192144]: 2025-10-02 12:33:38.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:39 np0005466013 nova_compute[192144]: 2025-10-02 12:33:39.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:40 np0005466013 nova_compute[192144]: 2025-10-02 12:33:40.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.594 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.594 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.612 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.772 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.773 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.780 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.781 2 INFO nova.compute.claims [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:33:41 np0005466013 nova_compute[192144]: 2025-10-02 12:33:41.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.039 2 DEBUG nova.compute.provider_tree [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.059 2 DEBUG nova.scheduler.client.report [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.083 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.084 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.149 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.150 2 DEBUG nova.network.neutron [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.188 2 INFO nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.217 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.389 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.390 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.391 2 INFO nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Creating image(s)#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.391 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "/var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.392 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.392 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.410 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.499 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.500 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.501 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.513 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.595 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.597 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.659 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.661 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.662 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.752 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.753 2 DEBUG nova.virt.disk.api [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Checking if we can resize image /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.754 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.847 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.849 2 DEBUG nova.virt.disk.api [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Cannot resize image /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.850 2 DEBUG nova.objects.instance [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'migration_context' on Instance uuid d69f983e-e1c9-488c-a48e-2684e425362a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.871 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.872 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Ensure instance console log exists: /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.873 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.873 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.874 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:42 np0005466013 nova_compute[192144]: 2025-10-02 12:33:42.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:43 np0005466013 nova_compute[192144]: 2025-10-02 12:33:43.137 2 DEBUG nova.policy [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:33:43 np0005466013 podman[244903]: 2025-10-02 12:33:43.694806389 +0000 UTC m=+0.068630841 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:33:43 np0005466013 podman[244904]: 2025-10-02 12:33:43.74060493 +0000 UTC m=+0.094645440 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:33:43 np0005466013 podman[244910]: 2025-10-02 12:33:43.793928848 +0000 UTC m=+0.139342527 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:33:44 np0005466013 nova_compute[192144]: 2025-10-02 12:33:44.574 2 DEBUG nova.network.neutron [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Successfully created port: c52d24c3-c76d-49e4-9b0f-2640127e1fa7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:33:44 np0005466013 nova_compute[192144]: 2025-10-02 12:33:44.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:44 np0005466013 nova_compute[192144]: 2025-10-02 12:33:44.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:33:44 np0005466013 nova_compute[192144]: 2025-10-02 12:33:44.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:33:45 np0005466013 nova_compute[192144]: 2025-10-02 12:33:45.029 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:33:45 np0005466013 nova_compute[192144]: 2025-10-02 12:33:45.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:45 np0005466013 nova_compute[192144]: 2025-10-02 12:33:45.253 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:45 np0005466013 nova_compute[192144]: 2025-10-02 12:33:45.254 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:45 np0005466013 nova_compute[192144]: 2025-10-02 12:33:45.254 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:33:45 np0005466013 nova_compute[192144]: 2025-10-02 12:33:45.255 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4958df02-e9fa-4cb2-9175-4313cd3fd658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:46 np0005466013 nova_compute[192144]: 2025-10-02 12:33:46.317 2 DEBUG nova.network.neutron [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Successfully updated port: c52d24c3-c76d-49e4-9b0f-2640127e1fa7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:33:46 np0005466013 nova_compute[192144]: 2025-10-02 12:33:46.338 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "refresh_cache-d69f983e-e1c9-488c-a48e-2684e425362a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:46 np0005466013 nova_compute[192144]: 2025-10-02 12:33:46.339 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquired lock "refresh_cache-d69f983e-e1c9-488c-a48e-2684e425362a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:46 np0005466013 nova_compute[192144]: 2025-10-02 12:33:46.339 2 DEBUG nova.network.neutron [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:33:46 np0005466013 nova_compute[192144]: 2025-10-02 12:33:46.979 2 DEBUG nova.network.neutron [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:33:47 np0005466013 nova_compute[192144]: 2025-10-02 12:33:47.094 2 DEBUG nova.compute.manager [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received event network-changed-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:47 np0005466013 nova_compute[192144]: 2025-10-02 12:33:47.095 2 DEBUG nova.compute.manager [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Refreshing instance network info cache due to event network-changed-c52d24c3-c76d-49e4-9b0f-2640127e1fa7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:47 np0005466013 nova_compute[192144]: 2025-10-02 12:33:47.095 2 DEBUG oslo_concurrency.lockutils [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-d69f983e-e1c9-488c-a48e-2684e425362a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:47 np0005466013 nova_compute[192144]: 2025-10-02 12:33:47.973 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updating instance_info_cache with network_info: [{"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.002 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-4958df02-e9fa-4cb2-9175-4313cd3fd658" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.002 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.003 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.432 2 DEBUG nova.network.neutron [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Updating instance_info_cache with network_info: [{"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.487 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Releasing lock "refresh_cache-d69f983e-e1c9-488c-a48e-2684e425362a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.488 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance network_info: |[{"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.489 2 DEBUG oslo_concurrency.lockutils [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-d69f983e-e1c9-488c-a48e-2684e425362a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.490 2 DEBUG nova.network.neutron [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Refreshing network info cache for port c52d24c3-c76d-49e4-9b0f-2640127e1fa7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.493 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Start _get_guest_xml network_info=[{"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.503 2 WARNING nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.509 2 DEBUG nova.virt.libvirt.host [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.509 2 DEBUG nova.virt.libvirt.host [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.516 2 DEBUG nova.virt.libvirt.host [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.517 2 DEBUG nova.virt.libvirt.host [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.519 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.519 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.520 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.520 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.520 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.521 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.521 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.521 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.522 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.522 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.522 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.523 2 DEBUG nova.virt.hardware [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.527 2 DEBUG nova.virt.libvirt.vif [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1701542532',display_name='tempest-ServersTestJSON-server-1701542532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1701542532',id=151,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-198inr7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:42Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=d69f983e-e1c9-488c-a48e-2684e425362a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.528 2 DEBUG nova.network.os_vif_util [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.529 2 DEBUG nova.network.os_vif_util [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:22:37,bridge_name='br-int',has_traffic_filtering=True,id=c52d24c3-c76d-49e4-9b0f-2640127e1fa7,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc52d24c3-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.529 2 DEBUG nova.objects.instance [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'pci_devices' on Instance uuid d69f983e-e1c9-488c-a48e-2684e425362a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.563 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <uuid>d69f983e-e1c9-488c-a48e-2684e425362a</uuid>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <name>instance-00000097</name>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <nova:name>tempest-ServersTestJSON-server-1701542532</nova:name>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:33:48</nova:creationTime>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:user uuid="27daa263abb54d4d8e3ae34cd1c5ccf5">tempest-ServersTestJSON-1163535506-project-member</nova:user>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:project uuid="a4a7099974504a798e1607c8e6a1f570">tempest-ServersTestJSON-1163535506</nova:project>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        <nova:port uuid="c52d24c3-c76d-49e4-9b0f-2640127e1fa7">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <entry name="serial">d69f983e-e1c9-488c-a48e-2684e425362a</entry>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <entry name="uuid">d69f983e-e1c9-488c-a48e-2684e425362a</entry>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk.config"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:3a:22:37"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <target dev="tapc52d24c3-c7"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/console.log" append="off"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:33:48 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:33:48 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:33:48 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:33:48 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.565 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Preparing to wait for external event network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.566 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.567 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.568 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.569 2 DEBUG nova.virt.libvirt.vif [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1701542532',display_name='tempest-ServersTestJSON-server-1701542532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1701542532',id=151,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-198inr7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:42Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=d69f983e-e1c9-488c-a48e-2684e425362a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.570 2 DEBUG nova.network.os_vif_util [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.571 2 DEBUG nova.network.os_vif_util [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:22:37,bridge_name='br-int',has_traffic_filtering=True,id=c52d24c3-c76d-49e4-9b0f-2640127e1fa7,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc52d24c3-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.572 2 DEBUG os_vif [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:22:37,bridge_name='br-int',has_traffic_filtering=True,id=c52d24c3-c76d-49e4-9b0f-2640127e1fa7,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc52d24c3-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.573 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.574 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.580 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc52d24c3-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc52d24c3-c7, col_values=(('external_ids', {'iface-id': 'c52d24c3-c76d-49e4-9b0f-2640127e1fa7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:22:37', 'vm-uuid': 'd69f983e-e1c9-488c-a48e-2684e425362a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:48 np0005466013 NetworkManager[51205]: <info>  [1759408428.5852] manager: (tapc52d24c3-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.597 2 INFO os_vif [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:22:37,bridge_name='br-int',has_traffic_filtering=True,id=c52d24c3-c76d-49e4-9b0f-2640127e1fa7,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc52d24c3-c7')#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.823 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.824 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.825 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No VIF found with MAC fa:16:3e:3a:22:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.826 2 INFO nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Using config drive#033[00m
Oct  2 08:33:48 np0005466013 nova_compute[192144]: 2025-10-02 12:33:48.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.337 2 INFO nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Creating config drive at /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk.config#033[00m
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.345 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp811698w4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.490 2 DEBUG oslo_concurrency.processutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp811698w4" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:49 np0005466013 kernel: tapc52d24c3-c7: entered promiscuous mode
Oct  2 08:33:49 np0005466013 NetworkManager[51205]: <info>  [1759408429.5806] manager: (tapc52d24c3-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Oct  2 08:33:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:49Z|00641|binding|INFO|Claiming lport c52d24c3-c76d-49e4-9b0f-2640127e1fa7 for this chassis.
Oct  2 08:33:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:49Z|00642|binding|INFO|c52d24c3-c76d-49e4-9b0f-2640127e1fa7: Claiming fa:16:3e:3a:22:37 10.100.0.3
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:49Z|00643|binding|INFO|Setting lport c52d24c3-c76d-49e4-9b0f-2640127e1fa7 ovn-installed in OVS
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:49 np0005466013 systemd-udevd[244985]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:49 np0005466013 ovn_controller[94366]: 2025-10-02T12:33:49Z|00644|binding|INFO|Setting lport c52d24c3-c76d-49e4-9b0f-2640127e1fa7 up in Southbound
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.653 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:22:37 10.100.0.3'], port_security=['fa:16:3e:3a:22:37 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd69f983e-e1c9-488c-a48e-2684e425362a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c52d24c3-c76d-49e4-9b0f-2640127e1fa7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.655 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c52d24c3-c76d-49e4-9b0f-2640127e1fa7 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 bound to our chassis#033[00m
Oct  2 08:33:49 np0005466013 NetworkManager[51205]: <info>  [1759408429.6593] device (tapc52d24c3-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:49 np0005466013 NetworkManager[51205]: <info>  [1759408429.6605] device (tapc52d24c3-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.661 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3#033[00m
Oct  2 08:33:49 np0005466013 systemd-machined[152202]: New machine qemu-71-instance-00000097.
Oct  2 08:33:49 np0005466013 systemd[1]: Started Virtual Machine qemu-71-instance-00000097.
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.689 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ac03f1-2b70-4643-aa7d-e15bcb82e603]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.734 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ae240aa3-f0bc-41d5-8790-1321a189fcc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.740 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[a7949e23-e0d3-4f44-88e9-4b98735cad8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.783 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[30865cc9-a031-467f-9b7c-72d9a5f4d109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.806 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bfeb3d52-d207-41ad-9cc5-6a959aa8c34d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630877, 'reachable_time': 30317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245000, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.829 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[64837c56-2523-4c7a-a12f-73fb5cdb4c19]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1acf42c5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630892, 'tstamp': 630892}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245001, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1acf42c5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630895, 'tstamp': 630895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245001, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.831 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:49 np0005466013 nova_compute[192144]: 2025-10-02 12:33:49.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.836 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1acf42c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.836 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.836 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1acf42c5-00, col_values=(('external_ids', {'iface-id': 'c198cb2e-a850-46e4-8295-a2f9c280ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:49 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:33:49.837 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.101 2 DEBUG nova.compute.manager [req-5aab7246-5eaa-486f-8bf7-502227af183f req-0819d36f-aa78-42bf-94a5-e4eb7796e78e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received event network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.102 2 DEBUG oslo_concurrency.lockutils [req-5aab7246-5eaa-486f-8bf7-502227af183f req-0819d36f-aa78-42bf-94a5-e4eb7796e78e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.103 2 DEBUG oslo_concurrency.lockutils [req-5aab7246-5eaa-486f-8bf7-502227af183f req-0819d36f-aa78-42bf-94a5-e4eb7796e78e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.103 2 DEBUG oslo_concurrency.lockutils [req-5aab7246-5eaa-486f-8bf7-502227af183f req-0819d36f-aa78-42bf-94a5-e4eb7796e78e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.103 2 DEBUG nova.compute.manager [req-5aab7246-5eaa-486f-8bf7-502227af183f req-0819d36f-aa78-42bf-94a5-e4eb7796e78e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Processing event network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.633 2 DEBUG nova.network.neutron [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Updated VIF entry in instance network info cache for port c52d24c3-c76d-49e4-9b0f-2640127e1fa7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.634 2 DEBUG nova.network.neutron [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Updating instance_info_cache with network_info: [{"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.658 2 DEBUG oslo_concurrency.lockutils [req-c185cc82-18fb-4a91-bafd-067a52623422 req-b1af5c32-2135-4076-af3a-a07d0a4c0f1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-d69f983e-e1c9-488c-a48e-2684e425362a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.918 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.919 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408430.9170504, d69f983e-e1c9-488c-a48e-2684e425362a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.920 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.924 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.930 2 INFO nova.virt.libvirt.driver [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance spawned successfully.#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.931 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.945 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.952 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.956 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.956 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.957 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.958 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.958 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.959 2 DEBUG nova.virt.libvirt.driver [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.975 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.976 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408430.9174497, d69f983e-e1c9-488c-a48e-2684e425362a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:33:50 np0005466013 nova_compute[192144]: 2025-10-02 12:33:50.976 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] VM Paused (Lifecycle Event)
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.004 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.009 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408430.9239025, d69f983e-e1c9-488c-a48e-2684e425362a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.009 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] VM Resumed (Lifecycle Event)
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.038 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.042 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.048 2 INFO nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Took 8.66 seconds to spawn the instance on the hypervisor.
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.048 2 DEBUG nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.078 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.180 2 INFO nova.compute.manager [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Took 9.45 seconds to build instance.
Oct  2 08:33:51 np0005466013 nova_compute[192144]: 2025-10-02 12:33:51.199 2 DEBUG oslo_concurrency.lockutils [None req-bbab651b-ee8c-4f8f-8a58-992f50d670b6 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:33:52 np0005466013 nova_compute[192144]: 2025-10-02 12:33:52.276 2 DEBUG nova.compute.manager [req-8597e88d-7d4d-4ef9-9b53-77ff2d96aace req-cd71d061-dafd-467b-bb5d-dd2ef754b544 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received event network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:33:52 np0005466013 nova_compute[192144]: 2025-10-02 12:33:52.278 2 DEBUG oslo_concurrency.lockutils [req-8597e88d-7d4d-4ef9-9b53-77ff2d96aace req-cd71d061-dafd-467b-bb5d-dd2ef754b544 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:33:52 np0005466013 nova_compute[192144]: 2025-10-02 12:33:52.279 2 DEBUG oslo_concurrency.lockutils [req-8597e88d-7d4d-4ef9-9b53-77ff2d96aace req-cd71d061-dafd-467b-bb5d-dd2ef754b544 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:33:52 np0005466013 nova_compute[192144]: 2025-10-02 12:33:52.280 2 DEBUG oslo_concurrency.lockutils [req-8597e88d-7d4d-4ef9-9b53-77ff2d96aace req-cd71d061-dafd-467b-bb5d-dd2ef754b544 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:33:52 np0005466013 nova_compute[192144]: 2025-10-02 12:33:52.280 2 DEBUG nova.compute.manager [req-8597e88d-7d4d-4ef9-9b53-77ff2d96aace req-cd71d061-dafd-467b-bb5d-dd2ef754b544 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] No waiting events found dispatching network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:33:52 np0005466013 nova_compute[192144]: 2025-10-02 12:33:52.281 2 WARNING nova.compute.manager [req-8597e88d-7d4d-4ef9-9b53-77ff2d96aace req-cd71d061-dafd-467b-bb5d-dd2ef754b544 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received unexpected event network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 for instance with vm_state active and task_state None.
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.473 2 DEBUG oslo_concurrency.lockutils [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.475 2 DEBUG oslo_concurrency.lockutils [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.475 2 DEBUG nova.compute.manager [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.479 2 DEBUG nova.compute.manager [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.480 2 DEBUG nova.objects.instance [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'flavor' on Instance uuid d69f983e-e1c9-488c-a48e-2684e425362a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.502 2 DEBUG nova.objects.instance [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'info_cache' on Instance uuid d69f983e-e1c9-488c-a48e-2684e425362a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.539 2 DEBUG nova.virt.libvirt.driver [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Oct  2 08:33:53 np0005466013 nova_compute[192144]: 2025-10-02 12:33:53.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:53 np0005466013 podman[245010]: 2025-10-02 12:33:53.683953451 +0000 UTC m=+0.054593359 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct  2 08:33:53 np0005466013 podman[245009]: 2025-10-02 12:33:53.693738669 +0000 UTC m=+0.065869654 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:33:53 np0005466013 podman[245011]: 2025-10-02 12:33:53.718581111 +0000 UTC m=+0.087356310 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, 
config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:33:55 np0005466013 nova_compute[192144]: 2025-10-02 12:33:55.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:58 np0005466013 nova_compute[192144]: 2025-10-02 12:33:58.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:59 np0005466013 podman[245068]: 2025-10-02 12:33:59.670944254 +0000 UTC m=+0.050837070 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:33:59 np0005466013 podman[245069]: 2025-10-02 12:33:59.707861206 +0000 UTC m=+0.085983127 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:34:00 np0005466013 nova_compute[192144]: 2025-10-02 12:34:00.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:34:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:02.316 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:34:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:02.317 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:34:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:02.318 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:34:02 np0005466013 nova_compute[192144]: 2025-10-02 12:34:02.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:34:02 np0005466013 nova_compute[192144]: 2025-10-02 12:34:02.995 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:34:02 np0005466013 nova_compute[192144]: 2025-10-02 12:34:02.995 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:34:02 np0005466013 nova_compute[192144]: 2025-10-02 12:34:02.996 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:34:02 np0005466013 nova_compute[192144]: 2025-10-02 12:34:02.996 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:34:02 np0005466013 nova_compute[192144]: 2025-10-02 12:34:02.997 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:34:02 np0005466013 nova_compute[192144]: 2025-10-02 12:34:02.997 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.039 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.039 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Image id cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 yields fingerprint 068b233e8d7f49e215e2900dde7d25b776cad955 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.039 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] image cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 at (/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955): checking
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.039 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] image cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 at (/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.174 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.175 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] 4958df02-e9fa-4cb2-9175-4313cd3fd658 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.175 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] 4958df02-e9fa-4cb2-9175-4313cd3fd658 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.175 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.234 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.235 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 4958df02-e9fa-4cb2-9175-4313cd3fd658 is backed by 068b233e8d7f49e215e2900dde7d25b776cad955 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.235 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.235 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] 1eda1f2a-e061-4d62-b09d-49ac1dc55ace has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.236 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.291 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.292 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 1eda1f2a-e061-4d62-b09d-49ac1dc55ace is backed by 068b233e8d7f49e215e2900dde7d25b776cad955 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.293 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] d69f983e-e1c9-488c-a48e-2684e425362a is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.293 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] d69f983e-e1c9-488c-a48e-2684e425362a has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.293 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.353 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.354 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance d69f983e-e1c9-488c-a48e-2684e425362a is backed by 068b233e8d7f49e215e2900dde7d25b776cad955 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.354 2 WARNING nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.354 2 WARNING nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.354 2 WARNING nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.355 2 WARNING nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.355 2 WARNING nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.355 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Active base files: /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.355 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Removable base files: /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15 /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204 /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.356 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/c05f5917d86516c44566bebc89543d31048e148b
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.356 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.356 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/c5d7c7df9c32610775fd016cc9585c255e2a2d15
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.356 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/bf7d91c80d713bd4a5b1fc203f1a5c35101d6204
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.356 2 INFO nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2563015c4ccc448cfc2f148d9d2544bae12af308
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.357 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.357 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.357 2 DEBUG nova.virt.libvirt.imagecache [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.584 2 DEBUG nova.virt.libvirt.driver [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:34:03 np0005466013 nova_compute[192144]: 2025-10-02 12:34:03.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005466013 nova_compute[192144]: 2025-10-02 12:34:05.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:05Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3a:22:37 10.100.0.3
Oct  2 08:34:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:05Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:22:37 10.100.0.3
Oct  2 08:34:08 np0005466013 nova_compute[192144]: 2025-10-02 12:34:08.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:08 np0005466013 kernel: tapc52d24c3-c7 (unregistering): left promiscuous mode
Oct  2 08:34:08 np0005466013 NetworkManager[51205]: <info>  [1759408448.7979] device (tapc52d24c3-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:08Z|00645|binding|INFO|Releasing lport c52d24c3-c76d-49e4-9b0f-2640127e1fa7 from this chassis (sb_readonly=0)
Oct  2 08:34:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:08Z|00646|binding|INFO|Setting lport c52d24c3-c76d-49e4-9b0f-2640127e1fa7 down in Southbound
Oct  2 08:34:08 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:08Z|00647|binding|INFO|Removing iface tapc52d24c3-c7 ovn-installed in OVS
Oct  2 08:34:08 np0005466013 nova_compute[192144]: 2025-10-02 12:34:08.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:08 np0005466013 nova_compute[192144]: 2025-10-02 12:34:08.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.863 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:22:37 10.100.0.3'], port_security=['fa:16:3e:3a:22:37 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd69f983e-e1c9-488c-a48e-2684e425362a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=c52d24c3-c76d-49e4-9b0f-2640127e1fa7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.865 103323 INFO neutron.agent.ovn.metadata.agent [-] Port c52d24c3-c76d-49e4-9b0f-2640127e1fa7 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 unbound from our chassis#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.867 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.888 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[536604a5-1ce0-481a-929b-3ad2cb69d539]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:08 np0005466013 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000097.scope: Deactivated successfully.
Oct  2 08:34:08 np0005466013 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000097.scope: Consumed 14.408s CPU time.
Oct  2 08:34:08 np0005466013 systemd-machined[152202]: Machine qemu-71-instance-00000097 terminated.
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.919 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[db732432-927c-4bbf-80ea-958309e4018e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.922 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9a17df29-eacd-4ad1-b291-40026caa0d41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.950 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[315ad34e-f914-4b38-90f5-2bf21c192fb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.969 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8cf881-b80b-426c-ac8d-357236cc9846]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630877, 'reachable_time': 30317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245163, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.985 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f01267fc-0af0-4668-a159-81be75fb9b65]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1acf42c5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630892, 'tstamp': 630892}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245164, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1acf42c5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630895, 'tstamp': 630895}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245164, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.987 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:08 np0005466013 nova_compute[192144]: 2025-10-02 12:34:08.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:08 np0005466013 nova_compute[192144]: 2025-10-02 12:34:08.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.992 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1acf42c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.993 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.993 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1acf42c5-00, col_values=(('external_ids', {'iface-id': 'c198cb2e-a850-46e4-8295-a2f9c280ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:08.993 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.088 2 DEBUG nova.compute.manager [req-4896badc-378c-4e71-beee-d8dabef846f0 req-43b9038f-5472-481e-92d3-a05cde81c7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received event network-vif-unplugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.088 2 DEBUG oslo_concurrency.lockutils [req-4896badc-378c-4e71-beee-d8dabef846f0 req-43b9038f-5472-481e-92d3-a05cde81c7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.088 2 DEBUG oslo_concurrency.lockutils [req-4896badc-378c-4e71-beee-d8dabef846f0 req-43b9038f-5472-481e-92d3-a05cde81c7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.088 2 DEBUG oslo_concurrency.lockutils [req-4896badc-378c-4e71-beee-d8dabef846f0 req-43b9038f-5472-481e-92d3-a05cde81c7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.089 2 DEBUG nova.compute.manager [req-4896badc-378c-4e71-beee-d8dabef846f0 req-43b9038f-5472-481e-92d3-a05cde81c7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] No waiting events found dispatching network-vif-unplugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.089 2 WARNING nova.compute.manager [req-4896badc-378c-4e71-beee-d8dabef846f0 req-43b9038f-5472-481e-92d3-a05cde81c7d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received unexpected event network-vif-unplugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 for instance with vm_state active and task_state powering-off.#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.614 2 INFO nova.virt.libvirt.driver [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance shutdown successfully after 16 seconds.#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.618 2 INFO nova.virt.libvirt.driver [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance destroyed successfully.#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.619 2 DEBUG nova.objects.instance [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'numa_topology' on Instance uuid d69f983e-e1c9-488c-a48e-2684e425362a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.640 2 DEBUG nova.compute.manager [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:09 np0005466013 nova_compute[192144]: 2025-10-02 12:34:09.745 2 DEBUG oslo_concurrency.lockutils [None req-6e2be73b-a37c-428a-9959-97ed1eb2e587 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 16.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:10 np0005466013 nova_compute[192144]: 2025-10-02 12:34:10.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:11 np0005466013 nova_compute[192144]: 2025-10-02 12:34:11.359 2 DEBUG nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received event network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:11 np0005466013 nova_compute[192144]: 2025-10-02 12:34:11.359 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:11 np0005466013 nova_compute[192144]: 2025-10-02 12:34:11.359 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:11 np0005466013 nova_compute[192144]: 2025-10-02 12:34:11.359 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:11 np0005466013 nova_compute[192144]: 2025-10-02 12:34:11.359 2 DEBUG nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] No waiting events found dispatching network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:11 np0005466013 nova_compute[192144]: 2025-10-02 12:34:11.360 2 WARNING nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received unexpected event network-vif-plugged-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 for instance with vm_state stopped and task_state None.#033[00m
Oct  2 08:34:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:11.877 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:11.878 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:34:11 np0005466013 nova_compute[192144]: 2025-10-02 12:34:11.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.697 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.697 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.697 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.698 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.698 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.709 2 INFO nova.compute.manager [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Terminating instance#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.722 2 DEBUG nova.compute.manager [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.729 2 INFO nova.virt.libvirt.driver [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Instance destroyed successfully.#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.729 2 DEBUG nova.objects.instance [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'resources' on Instance uuid d69f983e-e1c9-488c-a48e-2684e425362a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.740 2 DEBUG nova.virt.libvirt.vif [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1701542532',display_name='tempest-Íñstáñcé-1658110676',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1701542532',id=151,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:33:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-198inr7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner
_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:34:11Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=d69f983e-e1c9-488c-a48e-2684e425362a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.741 2 DEBUG nova.network.os_vif_util [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "address": "fa:16:3e:3a:22:37", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc52d24c3-c7", "ovs_interfaceid": "c52d24c3-c76d-49e4-9b0f-2640127e1fa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.742 2 DEBUG nova.network.os_vif_util [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:22:37,bridge_name='br-int',has_traffic_filtering=True,id=c52d24c3-c76d-49e4-9b0f-2640127e1fa7,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc52d24c3-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.742 2 DEBUG os_vif [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:22:37,bridge_name='br-int',has_traffic_filtering=True,id=c52d24c3-c76d-49e4-9b0f-2640127e1fa7,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc52d24c3-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.745 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc52d24c3-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.753 2 INFO os_vif [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:22:37,bridge_name='br-int',has_traffic_filtering=True,id=c52d24c3-c76d-49e4-9b0f-2640127e1fa7,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc52d24c3-c7')#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.753 2 INFO nova.virt.libvirt.driver [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Deleting instance files /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a_del#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.754 2 INFO nova.virt.libvirt.driver [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Deletion of /var/lib/nova/instances/d69f983e-e1c9-488c-a48e-2684e425362a_del complete#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.832 2 INFO nova.compute.manager [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Took 0.11 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.833 2 DEBUG oslo.service.loopingcall [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.834 2 DEBUG nova.compute.manager [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:34:12 np0005466013 nova_compute[192144]: 2025-10-02 12:34:12.834 2 DEBUG nova.network.neutron [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.455 2 DEBUG nova.network.neutron [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.471 2 INFO nova.compute.manager [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Took 0.64 seconds to deallocate network for instance.#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.513 2 DEBUG nova.compute.manager [req-b8ab1fc8-97ff-4873-bfec-8a0f9c7e845d req-3eee2927-11cb-48de-9673-7934876aed7b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Received event network-vif-deleted-c52d24c3-c76d-49e4-9b0f-2640127e1fa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.548 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.549 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.650 2 DEBUG nova.compute.provider_tree [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.676 2 DEBUG nova.scheduler.client.report [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.698 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.724 2 INFO nova.scheduler.client.report [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Deleted allocations for instance d69f983e-e1c9-488c-a48e-2684e425362a#033[00m
Oct  2 08:34:13 np0005466013 nova_compute[192144]: 2025-10-02 12:34:13.803 2 DEBUG oslo_concurrency.lockutils [None req-e6d952c4-9c3e-43e1-adce-1aaf2dc7cc12 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "d69f983e-e1c9-488c-a48e-2684e425362a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:14 np0005466013 podman[245182]: 2025-10-02 12:34:14.693897185 +0000 UTC m=+0.053814904 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:34:14 np0005466013 podman[245183]: 2025-10-02 12:34:14.693907165 +0000 UTC m=+0.054905209 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.706 2 DEBUG nova.compute.manager [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-changed-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.706 2 DEBUG nova.compute.manager [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing instance network info cache due to event network-changed-087a3a60-6c14-460a-99cf-049201b3c5b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.706 2 DEBUG oslo_concurrency.lockutils [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.707 2 DEBUG oslo_concurrency.lockutils [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.707 2 DEBUG nova.network.neutron [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Refreshing network info cache for port 087a3a60-6c14-460a-99cf-049201b3c5b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:14 np0005466013 podman[245184]: 2025-10-02 12:34:14.727828183 +0000 UTC m=+0.085051357 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.806 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.806 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.807 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.807 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.807 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.820 2 INFO nova.compute.manager [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Terminating instance#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.830 2 DEBUG nova.compute.manager [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:34:14 np0005466013 kernel: tap087a3a60-6c (unregistering): left promiscuous mode
Oct  2 08:34:14 np0005466013 NetworkManager[51205]: <info>  [1759408454.8518] device (tap087a3a60-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:14Z|00648|binding|INFO|Releasing lport 087a3a60-6c14-460a-99cf-049201b3c5b7 from this chassis (sb_readonly=0)
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:14Z|00649|binding|INFO|Setting lport 087a3a60-6c14-460a-99cf-049201b3c5b7 down in Southbound
Oct  2 08:34:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:14Z|00650|binding|INFO|Removing iface tap087a3a60-6c ovn-installed in OVS
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:14.870 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:a3:73 10.100.0.12'], port_security=['fa:16:3e:bf:a3:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbb75d33-0be1-4472-abdd-63f2f4f59602, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=087a3a60-6c14-460a-99cf-049201b3c5b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:14.872 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 087a3a60-6c14-460a-99cf-049201b3c5b7 in datapath 48ae5e44-4c0f-44dd-b2b0-7bd3123da141 unbound from our chassis#033[00m
Oct  2 08:34:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:14.876 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48ae5e44-4c0f-44dd-b2b0-7bd3123da141, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:14.877 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3c88ef41-3e59-41c8-92a8-6b42dca81c72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:14.878 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 namespace which is not needed anymore#033[00m
Oct  2 08:34:14 np0005466013 kernel: tap4d4f5d39-ff (unregistering): left promiscuous mode
Oct  2 08:34:14 np0005466013 NetworkManager[51205]: <info>  [1759408454.8873] device (tap4d4f5d39-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:14Z|00651|binding|INFO|Releasing lport 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 from this chassis (sb_readonly=0)
Oct  2 08:34:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:14Z|00652|binding|INFO|Setting lport 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 down in Southbound
Oct  2 08:34:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:14Z|00653|binding|INFO|Removing iface tap4d4f5d39-ff ovn-installed in OVS
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:14.909 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:3f:d2 2001:db8:0:1:f816:3eff:fe5b:3fd2 2001:db8::f816:3eff:fe5b:3fd2'], port_security=['fa:16:3e:5b:3f:d2 2001:db8:0:1:f816:3eff:fe5b:3fd2 2001:db8::f816:3eff:fe5b:3fd2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5b:3fd2/64 2001:db8::f816:3eff:fe5b:3fd2/64', 'neutron:device_id': '1eda1f2a-e061-4d62-b09d-49ac1dc55ace', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512667a6-6958-4dd6-8891-fcda7d607ab5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=4d4f5d39-ff10-4ea6-8b7a-df918302bf68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:14 np0005466013 nova_compute[192144]: 2025-10-02 12:34:14.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:14 np0005466013 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000091.scope: Deactivated successfully.
Oct  2 08:34:14 np0005466013 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000091.scope: Consumed 17.494s CPU time.
Oct  2 08:34:14 np0005466013 systemd-machined[152202]: Machine qemu-70-instance-00000091 terminated.
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [NOTICE]   (244491) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [NOTICE]   (244491) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [WARNING]  (244491) : Exiting Master process...
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [WARNING]  (244491) : Exiting Master process...
Oct  2 08:34:15 np0005466013 systemd-udevd[245250]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:15 np0005466013 NetworkManager[51205]: <info>  [1759408455.0471] manager: (tap087a3a60-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/287)
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [ALERT]    (244491) : Current worker (244493) exited with code 143 (Terminated)
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[244487]: [WARNING]  (244491) : All workers exited. Exiting... (0)
Oct  2 08:34:15 np0005466013 systemd[1]: libpod-d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b.scope: Deactivated successfully.
Oct  2 08:34:15 np0005466013 podman[245275]: 2025-10-02 12:34:15.055273828 +0000 UTC m=+0.054979302 container died d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:34:15 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:15 np0005466013 systemd[1]: var-lib-containers-storage-overlay-d624a9ebb372b44d75938e945b4d5f4fac3be329befebbae69cf96de91e7dca5-merged.mount: Deactivated successfully.
Oct  2 08:34:15 np0005466013 podman[245275]: 2025-10-02 12:34:15.108185212 +0000 UTC m=+0.107890716 container cleanup d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.107 2 INFO nova.virt.libvirt.driver [-] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance destroyed successfully.#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.108 2 DEBUG nova.objects.instance [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 1eda1f2a-e061-4d62-b09d-49ac1dc55ace obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:15 np0005466013 systemd[1]: libpod-conmon-d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b.scope: Deactivated successfully.
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.124 2 DEBUG nova.virt.libvirt.vif [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-618937776',display_name='tempest-TestGettingAddress-server-618937776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-618937776',id=145,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-1zbfq32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:47Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=1eda1f2a-e061-4d62-b09d-49ac1dc55ace,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.125 2 DEBUG nova.network.os_vif_util [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.125 2 DEBUG nova.network.os_vif_util [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:a3:73,bridge_name='br-int',has_traffic_filtering=True,id=087a3a60-6c14-460a-99cf-049201b3c5b7,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087a3a60-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.126 2 DEBUG os_vif [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:a3:73,bridge_name='br-int',has_traffic_filtering=True,id=087a3a60-6c14-460a-99cf-049201b3c5b7,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087a3a60-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.127 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap087a3a60-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.136 2 INFO os_vif [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:a3:73,bridge_name='br-int',has_traffic_filtering=True,id=087a3a60-6c14-460a-99cf-049201b3c5b7,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap087a3a60-6c')#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.137 2 DEBUG nova.virt.libvirt.vif [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-618937776',display_name='tempest-TestGettingAddress-server-618937776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-618937776',id=145,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-1zbfq32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:47Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=1eda1f2a-e061-4d62-b09d-49ac1dc55ace,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.137 2 DEBUG nova.network.os_vif_util [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.138 2 DEBUG nova.network.os_vif_util [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:3f:d2,bridge_name='br-int',has_traffic_filtering=True,id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d4f5d39-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.138 2 DEBUG os_vif [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:3f:d2,bridge_name='br-int',has_traffic_filtering=True,id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d4f5d39-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.140 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d4f5d39-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.146 2 INFO os_vif [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:3f:d2,bridge_name='br-int',has_traffic_filtering=True,id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d4f5d39-ff')#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.147 2 INFO nova.virt.libvirt.driver [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Deleting instance files /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace_del#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.147 2 INFO nova.virt.libvirt.driver [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Deletion of /var/lib/nova/instances/1eda1f2a-e061-4d62-b09d-49ac1dc55ace_del complete#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 podman[245331]: 2025-10-02 12:34:15.177508024 +0000 UTC m=+0.042168258 container remove d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.183 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d998eb19-2471-4742-bb6c-1fde41a94a89]: (4, ('Thu Oct  2 12:34:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 (d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b)\nd3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b\nThu Oct  2 12:34:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 (d3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b)\nd3b5e53a4ddd39a78eb89c6ae7760fe41b0e28b3434b9a96611a1bb584df512b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.184 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6a91de02-2476-4965-9aa3-37bce89c1b60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.185 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48ae5e44-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 kernel: tap48ae5e44-40: left promiscuous mode
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.203 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc9989e-11ef-4d36-932f-d09a65b2c5c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.230 2 INFO nova.compute.manager [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.231 2 DEBUG oslo.service.loopingcall [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.232 2 DEBUG nova.compute.manager [-] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.236 2 DEBUG nova.network.neutron [-] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.238 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7720bd-0918-452e-a8f9-c679dbdc367b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.239 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[57f13dd5-d220-49c7-a35c-c6cde61b3177]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.256 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[43b84a0f-4938-415f-9409-1dbb481a6af9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635768, 'reachable_time': 29876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245347, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.258 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.259 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[7acbd2be-7bff-4991-b18d-c19fdd93c66b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.260 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 4d4f5d39-ff10-4ea6-8b7a-df918302bf68 in datapath f55e0845-fc62-481d-a70d-8546faf2b8fb unbound from our chassis#033[00m
Oct  2 08:34:15 np0005466013 systemd[1]: run-netns-ovnmeta\x2d48ae5e44\x2d4c0f\x2d44dd\x2db2b0\x2d7bd3123da141.mount: Deactivated successfully.
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.262 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f55e0845-fc62-481d-a70d-8546faf2b8fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.263 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[87d8e85f-e681-4d8d-847f-bd482dc88101]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.263 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb namespace which is not needed anymore#033[00m
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [NOTICE]   (244565) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [NOTICE]   (244565) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [WARNING]  (244565) : Exiting Master process...
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [WARNING]  (244565) : Exiting Master process...
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [ALERT]    (244565) : Current worker (244567) exited with code 143 (Terminated)
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[244561]: [WARNING]  (244565) : All workers exited. Exiting... (0)
Oct  2 08:34:15 np0005466013 systemd[1]: libpod-41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2.scope: Deactivated successfully.
Oct  2 08:34:15 np0005466013 podman[245366]: 2025-10-02 12:34:15.446362325 +0000 UTC m=+0.088484316 container died 41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:34:15 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:15 np0005466013 systemd[1]: var-lib-containers-storage-overlay-3225e0d4ac0687970187dfe0aaaf0ae262783ef6b6e2b42017da435eb93ec836-merged.mount: Deactivated successfully.
Oct  2 08:34:15 np0005466013 podman[245366]: 2025-10-02 12:34:15.47991553 +0000 UTC m=+0.122037471 container cleanup 41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:34:15 np0005466013 systemd[1]: libpod-conmon-41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2.scope: Deactivated successfully.
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.493 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.494 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.495 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.496 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.496 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.516 2 INFO nova.compute.manager [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Terminating instance#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.529 2 DEBUG nova.compute.manager [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:34:15 np0005466013 podman[245397]: 2025-10-02 12:34:15.552154404 +0000 UTC m=+0.047339831 container remove 41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:34:15 np0005466013 kernel: tapbf1d62fc-3a (unregistering): left promiscuous mode
Oct  2 08:34:15 np0005466013 NetworkManager[51205]: <info>  [1759408455.5594] device (tapbf1d62fc-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.564 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[becec438-6826-4106-889f-d492370ddffe]: (4, ('Thu Oct  2 12:34:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb (41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2)\n41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2\nThu Oct  2 12:34:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb (41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2)\n41225329b868b978a8fa3ab90f74dafc613e0019951b5ef17a29a181a58685c2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:15Z|00654|binding|INFO|Releasing lport bf1d62fc-3a8d-4493-ae99-723fac577d26 from this chassis (sb_readonly=0)
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:15Z|00655|binding|INFO|Setting lport bf1d62fc-3a8d-4493-ae99-723fac577d26 down in Southbound
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.566 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7f55cb3d-859e-4194-9fbf-0b68b58578cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.568 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf55e0845-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:15Z|00656|binding|INFO|Removing iface tapbf1d62fc-3a ovn-installed in OVS
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.574 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:4b:3d 10.100.0.9'], port_security=['fa:16:3e:4a:4b:3d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4958df02-e9fa-4cb2-9175-4313cd3fd658', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=bf1d62fc-3a8d-4493-ae99-723fac577d26) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:15 np0005466013 kernel: tapf55e0845-f0: left promiscuous mode
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.599 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[de652c46-7502-4eae-8cab-aa7163707798]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Oct  2 08:34:15 np0005466013 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008c.scope: Consumed 18.776s CPU time.
Oct  2 08:34:15 np0005466013 systemd-machined[152202]: Machine qemu-69-instance-0000008c terminated.
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.643 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b78e5288-6895-41c1-b0fc-c7285cca2495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.645 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f41d3124-bcd2-496e-af57-fc95bf756e20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.661 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1f08604a-57b9-4b61-aae2-29f9c4ee8396]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635914, 'reachable_time': 32975, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245420, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.663 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.664 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[830ef8a5-b91e-44ba-a6ba-7795030bdfde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.664 103323 INFO neutron.agent.ovn.metadata.agent [-] Port bf1d62fc-3a8d-4493-ae99-723fac577d26 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 unbound from our chassis#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.666 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.667 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e62865bc-52b2-4a91-b66c-e43948eeab10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.667 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace which is not needed anymore#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.708 2 DEBUG nova.compute.manager [req-d3d39e1b-901a-4b76-9628-e09b70090bcb req-253bb700-9286-42db-82f1-e1a0ba628ff1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-unplugged-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.709 2 DEBUG oslo_concurrency.lockutils [req-d3d39e1b-901a-4b76-9628-e09b70090bcb req-253bb700-9286-42db-82f1-e1a0ba628ff1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.710 2 DEBUG oslo_concurrency.lockutils [req-d3d39e1b-901a-4b76-9628-e09b70090bcb req-253bb700-9286-42db-82f1-e1a0ba628ff1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.710 2 DEBUG oslo_concurrency.lockutils [req-d3d39e1b-901a-4b76-9628-e09b70090bcb req-253bb700-9286-42db-82f1-e1a0ba628ff1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.710 2 DEBUG nova.compute.manager [req-d3d39e1b-901a-4b76-9628-e09b70090bcb req-253bb700-9286-42db-82f1-e1a0ba628ff1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] No waiting events found dispatching network-vif-unplugged-087a3a60-6c14-460a-99cf-049201b3c5b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.711 2 DEBUG nova.compute.manager [req-d3d39e1b-901a-4b76-9628-e09b70090bcb req-253bb700-9286-42db-82f1-e1a0ba628ff1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-unplugged-087a3a60-6c14-460a-99cf-049201b3c5b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.799 2 INFO nova.virt.libvirt.driver [-] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Instance destroyed successfully.#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.800 2 DEBUG nova.objects.instance [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'resources' on Instance uuid 4958df02-e9fa-4cb2-9175-4313cd3fd658 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244079]: [NOTICE]   (244084) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244079]: [NOTICE]   (244084) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244079]: [WARNING]  (244084) : Exiting Master process...
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244079]: [ALERT]    (244084) : Current worker (244086) exited with code 143 (Terminated)
Oct  2 08:34:15 np0005466013 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244079]: [WARNING]  (244084) : All workers exited. Exiting... (0)
Oct  2 08:34:15 np0005466013 systemd[1]: libpod-076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e.scope: Deactivated successfully.
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.819 2 DEBUG nova.virt.libvirt.vif [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:31:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-1854753736',display_name='tempest-₡-1854753736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest--1854753736',id=140,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:31:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-s2du0lnf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1163535506'
,owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:58Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=4958df02-e9fa-4cb2-9175-4313cd3fd658,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.820 2 DEBUG nova.network.os_vif_util [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "address": "fa:16:3e:4a:4b:3d", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf1d62fc-3a", "ovs_interfaceid": "bf1d62fc-3a8d-4493-ae99-723fac577d26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.821 2 DEBUG nova.network.os_vif_util [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:4b:3d,bridge_name='br-int',has_traffic_filtering=True,id=bf1d62fc-3a8d-4493-ae99-723fac577d26,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf1d62fc-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.821 2 DEBUG os_vif [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:4b:3d,bridge_name='br-int',has_traffic_filtering=True,id=bf1d62fc-3a8d-4493-ae99-723fac577d26,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf1d62fc-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.824 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf1d62fc-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:15 np0005466013 podman[245443]: 2025-10-02 12:34:15.825235777 +0000 UTC m=+0.053327759 container died 076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.831 2 INFO os_vif [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:4b:3d,bridge_name='br-int',has_traffic_filtering=True,id=bf1d62fc-3a8d-4493-ae99-723fac577d26,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf1d62fc-3a')#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.832 2 INFO nova.virt.libvirt.driver [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Deleting instance files /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658_del#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.833 2 INFO nova.virt.libvirt.driver [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Deletion of /var/lib/nova/instances/4958df02-e9fa-4cb2-9175-4313cd3fd658_del complete#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.873 2 DEBUG nova.compute.manager [req-d60d3f2a-552f-4abb-8428-d883660976fc req-6c91d4e0-3d40-451b-882a-2b0c3a4b1b02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-unplugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.874 2 DEBUG oslo_concurrency.lockutils [req-d60d3f2a-552f-4abb-8428-d883660976fc req-6c91d4e0-3d40-451b-882a-2b0c3a4b1b02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.874 2 DEBUG oslo_concurrency.lockutils [req-d60d3f2a-552f-4abb-8428-d883660976fc req-6c91d4e0-3d40-451b-882a-2b0c3a4b1b02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.874 2 DEBUG oslo_concurrency.lockutils [req-d60d3f2a-552f-4abb-8428-d883660976fc req-6c91d4e0-3d40-451b-882a-2b0c3a4b1b02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.875 2 DEBUG nova.compute.manager [req-d60d3f2a-552f-4abb-8428-d883660976fc req-6c91d4e0-3d40-451b-882a-2b0c3a4b1b02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] No waiting events found dispatching network-vif-unplugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.875 2 DEBUG nova.compute.manager [req-d60d3f2a-552f-4abb-8428-d883660976fc req-6c91d4e0-3d40-451b-882a-2b0c3a4b1b02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-unplugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:34:15 np0005466013 podman[245443]: 2025-10-02 12:34:15.88758692 +0000 UTC m=+0.115678902 container cleanup 076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:34:15 np0005466013 systemd[1]: libpod-conmon-076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e.scope: Deactivated successfully.
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.904 2 INFO nova.compute.manager [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.905 2 DEBUG oslo.service.loopingcall [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.906 2 DEBUG nova.compute.manager [-] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.906 2 DEBUG nova.network.neutron [-] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.944 2 DEBUG nova.compute.manager [req-cc9e2534-d235-4934-bf63-4fcdc207ec7e req-9e33f853-39ca-41a0-8397-052bc4bf5052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received event network-vif-unplugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.944 2 DEBUG oslo_concurrency.lockutils [req-cc9e2534-d235-4934-bf63-4fcdc207ec7e req-9e33f853-39ca-41a0-8397-052bc4bf5052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.945 2 DEBUG oslo_concurrency.lockutils [req-cc9e2534-d235-4934-bf63-4fcdc207ec7e req-9e33f853-39ca-41a0-8397-052bc4bf5052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.945 2 DEBUG oslo_concurrency.lockutils [req-cc9e2534-d235-4934-bf63-4fcdc207ec7e req-9e33f853-39ca-41a0-8397-052bc4bf5052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.945 2 DEBUG nova.compute.manager [req-cc9e2534-d235-4934-bf63-4fcdc207ec7e req-9e33f853-39ca-41a0-8397-052bc4bf5052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] No waiting events found dispatching network-vif-unplugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.946 2 DEBUG nova.compute.manager [req-cc9e2534-d235-4934-bf63-4fcdc207ec7e req-9e33f853-39ca-41a0-8397-052bc4bf5052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received event network-vif-unplugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:34:15 np0005466013 podman[245485]: 2025-10-02 12:34:15.97687595 +0000 UTC m=+0.062948872 container remove 076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.983 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[df65d5ec-19eb-4e55-8fd9-2c9323d1a370]: (4, ('Thu Oct  2 12:34:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e)\n076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e\nThu Oct  2 12:34:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e)\n076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.985 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d020e738-2c64-4ae0-8da2-c137c0d4d5cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.986 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:15 np0005466013 kernel: tap1acf42c5-00: left promiscuous mode
Oct  2 08:34:15 np0005466013 nova_compute[192144]: 2025-10-02 12:34:15.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:15.993 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[47a75e54-1476-482a-b926-2f279dbee55d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:16.029 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[47fcb0a5-a2f9-461a-b7e9-e9b4636a5347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:16.031 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[688971c0-926a-4a32-9284-9d4c9b445665]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:16.050 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5157b278-612f-4aed-a895-cf742adc181a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630869, 'reachable_time': 21661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245500, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:16.053 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:16.053 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[68534f57-e398-4898-a08e-1f17dc104876]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:16 np0005466013 systemd[1]: run-netns-ovnmeta\x2df55e0845\x2dfc62\x2d481d\x2da70d\x2d8546faf2b8fb.mount: Deactivated successfully.
Oct  2 08:34:16 np0005466013 systemd[1]: var-lib-containers-storage-overlay-bb3b9d86227baccfefdee04b6bd98d54b156dcf3151053924b0d7cc6d915815a-merged.mount: Deactivated successfully.
Oct  2 08:34:16 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-076b49b97b4d7e3fed7e3ab105a5a39e7ba7e445d514dd08b80745d61eae254e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:16 np0005466013 systemd[1]: run-netns-ovnmeta\x2d1acf42c5\x2d084c\x2d4cc4\x2dbdc5\x2d910eec0249e3.mount: Deactivated successfully.
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.582 2 DEBUG nova.network.neutron [-] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.600 2 INFO nova.compute.manager [-] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Took 1.36 seconds to deallocate network for instance.#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.673 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.674 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.744 2 DEBUG nova.compute.provider_tree [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.757 2 DEBUG nova.scheduler.client.report [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.785 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.826 2 INFO nova.scheduler.client.report [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 1eda1f2a-e061-4d62-b09d-49ac1dc55ace#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.881 2 DEBUG nova.network.neutron [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updated VIF entry in instance network info cache for port 087a3a60-6c14-460a-99cf-049201b3c5b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.882 2 DEBUG nova.network.neutron [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Updating instance_info_cache with network_info: [{"id": "087a3a60-6c14-460a-99cf-049201b3c5b7", "address": "fa:16:3e:bf:a3:73", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap087a3a60-6c", "ovs_interfaceid": "087a3a60-6c14-460a-99cf-049201b3c5b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "address": "fa:16:3e:5b:3f:d2", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:3fd2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d4f5d39-ff", "ovs_interfaceid": "4d4f5d39-ff10-4ea6-8b7a-df918302bf68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.898 2 DEBUG nova.network.neutron [-] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.915 2 INFO nova.compute.manager [-] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Took 1.01 seconds to deallocate network for instance.#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.915 2 DEBUG oslo_concurrency.lockutils [req-8874e8fc-01b1-4d5c-9ee8-878da553e76b req-5bce72eb-b097-4d8a-bb84-b5d7ef71906f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1eda1f2a-e061-4d62-b09d-49ac1dc55ace" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.923 2 DEBUG oslo_concurrency.lockutils [None req-5b9713ac-c054-4c9b-a5df-aa7632bbd188 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.975 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:16 np0005466013 nova_compute[192144]: 2025-10-02 12:34:16.976 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.028 2 DEBUG nova.compute.provider_tree [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.046 2 DEBUG nova.scheduler.client.report [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.067 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.098 2 INFO nova.scheduler.client.report [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Deleted allocations for instance 4958df02-e9fa-4cb2-9175-4313cd3fd658#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.220 2 DEBUG oslo_concurrency.lockutils [None req-fa3c8d82-0e58-44ce-8bc3-ab1e6a937dd1 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.834 2 DEBUG nova.compute.manager [req-998e5b8d-86ce-4a01-85b6-a7cb0bb1a196 req-f5c0af6c-8411-4854-99f8-583c74ca185f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.835 2 DEBUG oslo_concurrency.lockutils [req-998e5b8d-86ce-4a01-85b6-a7cb0bb1a196 req-f5c0af6c-8411-4854-99f8-583c74ca185f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.835 2 DEBUG oslo_concurrency.lockutils [req-998e5b8d-86ce-4a01-85b6-a7cb0bb1a196 req-f5c0af6c-8411-4854-99f8-583c74ca185f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.835 2 DEBUG oslo_concurrency.lockutils [req-998e5b8d-86ce-4a01-85b6-a7cb0bb1a196 req-f5c0af6c-8411-4854-99f8-583c74ca185f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.835 2 DEBUG nova.compute.manager [req-998e5b8d-86ce-4a01-85b6-a7cb0bb1a196 req-f5c0af6c-8411-4854-99f8-583c74ca185f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] No waiting events found dispatching network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.836 2 WARNING nova.compute.manager [req-998e5b8d-86ce-4a01-85b6-a7cb0bb1a196 req-f5c0af6c-8411-4854-99f8-583c74ca185f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received unexpected event network-vif-plugged-087a3a60-6c14-460a-99cf-049201b3c5b7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.993 2 DEBUG nova.compute.manager [req-b5a41aa0-8b08-49ee-80f0-cef3a622ecd1 req-3519fea2-7832-4ae6-9cd9-a9caf54f1941 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.994 2 DEBUG oslo_concurrency.lockutils [req-b5a41aa0-8b08-49ee-80f0-cef3a622ecd1 req-3519fea2-7832-4ae6-9cd9-a9caf54f1941 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.994 2 DEBUG oslo_concurrency.lockutils [req-b5a41aa0-8b08-49ee-80f0-cef3a622ecd1 req-3519fea2-7832-4ae6-9cd9-a9caf54f1941 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.994 2 DEBUG oslo_concurrency.lockutils [req-b5a41aa0-8b08-49ee-80f0-cef3a622ecd1 req-3519fea2-7832-4ae6-9cd9-a9caf54f1941 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1eda1f2a-e061-4d62-b09d-49ac1dc55ace-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.995 2 DEBUG nova.compute.manager [req-b5a41aa0-8b08-49ee-80f0-cef3a622ecd1 req-3519fea2-7832-4ae6-9cd9-a9caf54f1941 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] No waiting events found dispatching network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.995 2 WARNING nova.compute.manager [req-b5a41aa0-8b08-49ee-80f0-cef3a622ecd1 req-3519fea2-7832-4ae6-9cd9-a9caf54f1941 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received unexpected event network-vif-plugged-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:34:17 np0005466013 nova_compute[192144]: 2025-10-02 12:34:17.995 2 DEBUG nova.compute.manager [req-b5a41aa0-8b08-49ee-80f0-cef3a622ecd1 req-3519fea2-7832-4ae6-9cd9-a9caf54f1941 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received event network-vif-deleted-bf1d62fc-3a8d-4493-ae99-723fac577d26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.051 2 DEBUG nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-deleted-087a3a60-6c14-460a-99cf-049201b3c5b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.052 2 INFO nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Neutron deleted interface 087a3a60-6c14-460a-99cf-049201b3c5b7; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.052 2 DEBUG nova.network.neutron [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.056 2 DEBUG nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Detach interface failed, port_id=087a3a60-6c14-460a-99cf-049201b3c5b7, reason: Instance 1eda1f2a-e061-4d62-b09d-49ac1dc55ace could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.057 2 DEBUG nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received event network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.057 2 DEBUG oslo_concurrency.lockutils [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.058 2 DEBUG oslo_concurrency.lockutils [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.058 2 DEBUG oslo_concurrency.lockutils [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4958df02-e9fa-4cb2-9175-4313cd3fd658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.058 2 DEBUG nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] No waiting events found dispatching network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.059 2 WARNING nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Received unexpected event network-vif-plugged-bf1d62fc-3a8d-4493-ae99-723fac577d26 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.059 2 DEBUG nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Received event network-vif-deleted-4d4f5d39-ff10-4ea6-8b7a-df918302bf68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.060 2 INFO nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Neutron deleted interface 4d4f5d39-ff10-4ea6-8b7a-df918302bf68; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.060 2 DEBUG nova.network.neutron [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:34:18 np0005466013 nova_compute[192144]: 2025-10-02 12:34:18.064 2 DEBUG nova.compute.manager [req-2c2a8e14-6b4d-49c2-92b5-12de764bd7c7 req-4dc8ef5f-ca0b-4250-9d2d-fd3a8f5924cf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Detach interface failed, port_id=4d4f5d39-ff10-4ea6-8b7a-df918302bf68, reason: Instance 1eda1f2a-e061-4d62-b09d-49ac1dc55ace could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:34:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:18.881 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:20 np0005466013 nova_compute[192144]: 2025-10-02 12:34:20.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:20 np0005466013 nova_compute[192144]: 2025-10-02 12:34:20.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:22 np0005466013 nova_compute[192144]: 2025-10-02 12:34:22.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:23 np0005466013 nova_compute[192144]: 2025-10-02 12:34:23.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:24 np0005466013 nova_compute[192144]: 2025-10-02 12:34:24.121 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408449.1197755, d69f983e-e1c9-488c-a48e-2684e425362a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:24 np0005466013 nova_compute[192144]: 2025-10-02 12:34:24.122 2 INFO nova.compute.manager [-] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:34:24 np0005466013 nova_compute[192144]: 2025-10-02 12:34:24.140 2 DEBUG nova.compute.manager [None req-eef378c9-5347-4f3e-84e3-1482041fcca2 - - - - - -] [instance: d69f983e-e1c9-488c-a48e-2684e425362a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:24 np0005466013 podman[245502]: 2025-10-02 12:34:24.70360914 +0000 UTC m=+0.073333892 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:34:24 np0005466013 podman[245503]: 2025-10-02 12:34:24.71411046 +0000 UTC m=+0.078470664 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Oct  2 08:34:24 np0005466013 podman[245504]: 2025-10-02 12:34:24.746675201 +0000 UTC m=+0.096494968 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:34:25 np0005466013 nova_compute[192144]: 2025-10-02 12:34:25.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:25 np0005466013 nova_compute[192144]: 2025-10-02 12:34:25.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.108 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408455.105865, 1eda1f2a-e061-4d62-b09d-49ac1dc55ace => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.108 2 INFO nova.compute.manager [-] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.131 2 DEBUG nova.compute.manager [None req-75c9110f-e5b6-4316-9c90-32dcad6136df - - - - - -] [instance: 1eda1f2a-e061-4d62-b09d-49ac1dc55ace] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:30 np0005466013 podman[245565]: 2025-10-02 12:34:30.709267279 +0000 UTC m=+0.076430930 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:34:30 np0005466013 podman[245564]: 2025-10-02 12:34:30.728464221 +0000 UTC m=+0.092910756 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.798 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408455.796977, 4958df02-e9fa-4cb2-9175-4313cd3fd658 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.799 2 INFO nova.compute.manager [-] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.823 2 DEBUG nova.compute.manager [None req-eac360e2-538c-428f-bf58-031f171064a2 - - - - - -] [instance: 4958df02-e9fa-4cb2-9175-4313cd3fd658] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:30 np0005466013 nova_compute[192144]: 2025-10-02 12:34:30.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:35 np0005466013 nova_compute[192144]: 2025-10-02 12:34:35.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:35 np0005466013 nova_compute[192144]: 2025-10-02 12:34:35.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.357 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.358 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.358 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.384 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.385 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.385 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.385 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.609 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.611 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5711MB free_disk=73.2005729675293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.612 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.612 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.699 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.699 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.722 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.737 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.778 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:34:36 np0005466013 nova_compute[192144]: 2025-10-02 12:34:36.779 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:39 np0005466013 nova_compute[192144]: 2025-10-02 12:34:39.416 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:40 np0005466013 nova_compute[192144]: 2025-10-02 12:34:40.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005466013 nova_compute[192144]: 2025-10-02 12:34:40.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005466013 nova_compute[192144]: 2025-10-02 12:34:40.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:43 np0005466013 nova_compute[192144]: 2025-10-02 12:34:43.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:43 np0005466013 nova_compute[192144]: 2025-10-02 12:34:43.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:45 np0005466013 nova_compute[192144]: 2025-10-02 12:34:45.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:45 np0005466013 podman[245609]: 2025-10-02 12:34:45.691704298 +0000 UTC m=+0.064137923 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:34:45 np0005466013 podman[245608]: 2025-10-02 12:34:45.711904412 +0000 UTC m=+0.086868346 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:34:45 np0005466013 podman[245610]: 2025-10-02 12:34:45.739631652 +0000 UTC m=+0.109950531 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:34:45 np0005466013 nova_compute[192144]: 2025-10-02 12:34:45.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:45 np0005466013 nova_compute[192144]: 2025-10-02 12:34:45.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:46 np0005466013 nova_compute[192144]: 2025-10-02 12:34:46.015 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:46 np0005466013 nova_compute[192144]: 2025-10-02 12:34:46.016 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:34:46 np0005466013 nova_compute[192144]: 2025-10-02 12:34:46.046 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:34:46 np0005466013 nova_compute[192144]: 2025-10-02 12:34:46.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.542 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.543 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.575 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.689 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.690 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.696 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.697 2 INFO nova.compute.claims [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.796 2 DEBUG nova.compute.provider_tree [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.811 2 DEBUG nova.scheduler.client.report [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.835 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.835 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.895 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.895 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.912 2 INFO nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.927 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:49 np0005466013 nova_compute[192144]: 2025-10-02 12:34:49.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.065 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.066 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.067 2 INFO nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Creating image(s)#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.067 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.068 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.068 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.080 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.173 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.174 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.175 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.185 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.264 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.265 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.298 2 DEBUG nova.policy [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.397 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk 1073741824" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.398 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.398 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.487 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.489 2 DEBUG nova.virt.disk.api [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.490 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.588 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.589 2 DEBUG nova.virt.disk.api [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.589 2 DEBUG nova.objects.instance [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid cd5cbe34-283a-4177-9d0b-bc35fadcde72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.618 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.618 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Ensure instance console log exists: /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.619 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.619 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.619 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:50 np0005466013 nova_compute[192144]: 2025-10-02 12:34:50.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:51 np0005466013 nova_compute[192144]: 2025-10-02 12:34:51.593 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Successfully created port: 5be4618b-6dbd-4495-af12-ea729df149d7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:34:52 np0005466013 nova_compute[192144]: 2025-10-02 12:34:52.121 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Successfully created port: a9826a7d-3298-45d6-b564-4b199244dec1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:34:53 np0005466013 nova_compute[192144]: 2025-10-02 12:34:53.468 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Successfully updated port: 5be4618b-6dbd-4495-af12-ea729df149d7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:34:53 np0005466013 nova_compute[192144]: 2025-10-02 12:34:53.629 2 DEBUG nova.compute.manager [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-changed-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:53 np0005466013 nova_compute[192144]: 2025-10-02 12:34:53.630 2 DEBUG nova.compute.manager [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing instance network info cache due to event network-changed-5be4618b-6dbd-4495-af12-ea729df149d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:53 np0005466013 nova_compute[192144]: 2025-10-02 12:34:53.630 2 DEBUG oslo_concurrency.lockutils [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:53 np0005466013 nova_compute[192144]: 2025-10-02 12:34:53.630 2 DEBUG oslo_concurrency.lockutils [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:53 np0005466013 nova_compute[192144]: 2025-10-02 12:34:53.630 2 DEBUG nova.network.neutron [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing network info cache for port 5be4618b-6dbd-4495-af12-ea729df149d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:53 np0005466013 nova_compute[192144]: 2025-10-02 12:34:53.937 2 DEBUG nova.network.neutron [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:34:54 np0005466013 nova_compute[192144]: 2025-10-02 12:34:54.223 2 DEBUG nova.network.neutron [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:54 np0005466013 nova_compute[192144]: 2025-10-02 12:34:54.240 2 DEBUG oslo_concurrency.lockutils [req-185a3928-bca1-445b-9ae8-2ca6cef918f3 req-7ddb42ed-9978-407d-8bf1-30af261ddb8a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:54 np0005466013 nova_compute[192144]: 2025-10-02 12:34:54.959 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Successfully updated port: a9826a7d-3298-45d6-b564-4b199244dec1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:34:54 np0005466013 nova_compute[192144]: 2025-10-02 12:34:54.982 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:54 np0005466013 nova_compute[192144]: 2025-10-02 12:34:54.983 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:54 np0005466013 nova_compute[192144]: 2025-10-02 12:34:54.983 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:34:55 np0005466013 nova_compute[192144]: 2025-10-02 12:34:55.165 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:34:55 np0005466013 nova_compute[192144]: 2025-10-02 12:34:55.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:55 np0005466013 podman[245689]: 2025-10-02 12:34:55.704718083 +0000 UTC m=+0.065123304 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  2 08:34:55 np0005466013 podman[245687]: 2025-10-02 12:34:55.72281735 +0000 UTC m=+0.086135923 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:34:55 np0005466013 podman[245688]: 2025-10-02 12:34:55.723499502 +0000 UTC m=+0.088021383 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 
9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, vcs-type=git)
Oct  2 08:34:55 np0005466013 nova_compute[192144]: 2025-10-02 12:34:55.747 2 DEBUG nova.compute.manager [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-changed-a9826a7d-3298-45d6-b564-4b199244dec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:55 np0005466013 nova_compute[192144]: 2025-10-02 12:34:55.747 2 DEBUG nova.compute.manager [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing instance network info cache due to event network-changed-a9826a7d-3298-45d6-b564-4b199244dec1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:55 np0005466013 nova_compute[192144]: 2025-10-02 12:34:55.748 2 DEBUG oslo_concurrency.lockutils [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:55 np0005466013 nova_compute[192144]: 2025-10-02 12:34:55.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005466013 nova_compute[192144]: 2025-10-02 12:34:56.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:56 np0005466013 nova_compute[192144]: 2025-10-02 12:34:56.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.011 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.011 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.031 2 DEBUG nova.network.neutron [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updating instance_info_cache with network_info: [{"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.052 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.053 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Instance network_info: |[{"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.054 2 DEBUG oslo_concurrency.lockutils [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.054 2 DEBUG nova.network.neutron [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing network info cache for port a9826a7d-3298-45d6-b564-4b199244dec1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.058 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Start _get_guest_xml network_info=[{"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.062 2 WARNING nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.067 2 DEBUG nova.virt.libvirt.host [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.067 2 DEBUG nova.virt.libvirt.host [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.072 2 DEBUG nova.virt.libvirt.host [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.073 2 DEBUG nova.virt.libvirt.host [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.074 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.074 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.075 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.075 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.075 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.075 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.075 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.076 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.076 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.076 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.076 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.076 2 DEBUG nova.virt.hardware [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
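The topology search traced above (preferred 0:0:0, limits 65536:65536:65536, 1 vCPU, yielding the single candidate 1:1:1) can be approximated with a short stand-alone sketch. This is a simplified re-implementation for illustration only, not Nova's actual `_get_possible_cpu_topologies`:

```python
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
    """Enumerate (sockets, cores, threads) triples whose product equals the
    vCPU count, subject to per-dimension maxima -- a simplified model of what
    the "Build topologies" / "Got N possible topologies" log lines describe."""
    found = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    found.append(VirtCPUTopology(s, c, t))
    return found

# For the m1.nano flavor in this log (1 vCPU), only 1:1:1 fits.
print(possible_topologies(1, 65536, 65536, 65536))
```

With more vCPUs the candidate list grows (e.g. 4 vCPUs admit six factorizations), which is why the real code then sorts the candidates against the preferred topology before choosing one.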
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.080 2 DEBUG nova.virt.libvirt.vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1033926726',display_name='tempest-TestGettingAddress-server-1033926726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1033926726',id=154,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-v2dngbvu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:49Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cd5cbe34-283a-4177-9d0b-bc35fadcde72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.080 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.081 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:d5:ad,bridge_name='br-int',has_traffic_filtering=True,id=5be4618b-6dbd-4495-af12-ea729df149d7,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be4618b-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.082 2 DEBUG nova.virt.libvirt.vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1033926726',display_name='tempest-TestGettingAddress-server-1033926726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1033926726',id=154,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-v2dngbvu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:49Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cd5cbe34-283a-4177-9d0b-bc35fadcde72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.082 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.083 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:00:e6,bridge_name='br-int',has_traffic_filtering=True,id=a9826a7d-3298-45d6-b564-4b199244dec1,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9826a7d-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
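The "Converting VIF ... Converted object VIFOpenVSwitch(...)" pairs above show Nova's legacy VIF dict being mapped to an os-vif object. The essence of that mapping can be sketched as a plain field extraction; this is a hand-written stand-in for illustration, not `nova.network.os_vif_util.nova_to_osvif_vif` itself, and it handles only the fields visible in this log:

```python
import json

def vif_summary(vif):
    """Pull the handful of fields the converted VIFOpenVSwitch object carries
    from a Nova legacy VIF dict (simplified illustration)."""
    return {
        "id": vif["id"],
        "address": vif["address"],
        "vif_name": vif["devname"],
        "bridge_name": vif["details"]["bridge_name"],
        "has_traffic_filtering": vif["details"]["port_filter"],
        "active": vif["active"],
    }

# Trimmed copy of the second VIF dict logged above (IPv6 port).
vif = json.loads("""{"id": "a9826a7d-3298-45d6-b564-4b199244dec1",
 "address": "fa:16:3e:5d:00:e6", "devname": "tapa9826a7d-32",
 "active": false,
 "details": {"port_filter": true, "bridge_name": "br-int"}}""")
print(vif_summary(vif))
```

Note how `devname` becomes `vif_name` and `details.port_filter` becomes `has_traffic_filtering` in the converted object printed by the log.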
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.083 2 DEBUG nova.objects.instance [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid cd5cbe34-283a-4177-9d0b-bc35fadcde72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.103 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <uuid>cd5cbe34-283a-4177-9d0b-bc35fadcde72</uuid>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <name>instance-0000009a</name>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestGettingAddress-server-1033926726</nova:name>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:34:57</nova:creationTime>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:port uuid="5be4618b-6dbd-4495-af12-ea729df149d7">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        <nova:port uuid="a9826a7d-3298-45d6-b564-4b199244dec1">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe5d:e6" ipVersion="6"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <entry name="serial">cd5cbe34-283a-4177-9d0b-bc35fadcde72</entry>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <entry name="uuid">cd5cbe34-283a-4177-9d0b-bc35fadcde72</entry>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.config"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:6a:d5:ad"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <target dev="tap5be4618b-6d"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:5d:00:e6"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <target dev="tapa9826a7d-32"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/console.log" append="off"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:34:57 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:34:57 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:34:57 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:34:57 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
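The `_get_guest_xml` dump above is the complete libvirt domain definition for the instance. A quick way to sanity-check such a dump (name, memory, which tap devices it wires up) is to parse it with the standard library; the snippet below operates on a trimmed copy of the logged XML, with the namespaced `<metadata>` block omitted for brevity:

```python
import xml.etree.ElementTree as ET

# Trimmed subset of the guest XML logged above (metadata, disks, controllers
# omitted) -- enough to show how to read the key fields back out.
DOMAIN_XML = """<domain type="kvm">
  <uuid>cd5cbe34-283a-4177-9d0b-bc35fadcde72</uuid>
  <name>instance-0000009a</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <devices>
    <interface type="ethernet">
      <mac address="fa:16:3e:6a:d5:ad"/>
      <target dev="tap5be4618b-6d"/>
    </interface>
    <interface type="ethernet">
      <mac address="fa:16:3e:5d:00:e6"/>
      <target dev="tapa9826a7d-32"/>
    </interface>
  </devices>
</domain>"""

root = ET.fromstring(DOMAIN_XML)
taps = [iface.find("target").get("dev") for iface in root.iter("interface")]
mem_mib = int(root.findtext("memory")) // 1024  # libvirt <memory> is in KiB
print(root.findtext("name"), mem_mib, "MiB", taps)
```

The 131072 KiB in `<memory>` is the flavor's 128 MiB, and the two tap device names match the `devname` fields of the VIFs converted earlier in the log.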
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.104 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Preparing to wait for external event network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.104 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.104 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.105 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.105 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Preparing to wait for external event network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.105 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.105 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.105 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
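The acquire/release pairs around `_create_or_get_event` above implement event registration before plugging the VIFs: under a per-instance-events lock, the manager records which `network-vif-plugged-<port>` notifications it expects, so Neutron's later callback can wake the waiting spawn. A minimal sketch of that pattern (illustrative only; the real implementation is `nova.compute.manager.InstanceEvents` using oslo.concurrency locks and eventlet):

```python
import threading

class InstanceEvents:
    """Toy version of the prepare-for-event pattern the lockutils lines
    trace: one lock guards a map of instance -> expected events, so a
    later network-vif-plugged notification can find its waiter."""

    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # instance_uuid -> {event_name: threading.Event}

    def prepare(self, instance_uuid, event_name):
        # Corresponds to the logged "Acquiring lock ... by
        # _create_or_get_event" / "released" pair.
        with self._lock:
            per_instance = self._events.setdefault(instance_uuid, {})
            return per_instance.setdefault(event_name, threading.Event())

    def pop(self, instance_uuid, event_name):
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

events = InstanceEvents()
waiter = events.prepare("cd5cbe34", "network-vif-plugged-5be4618b")
# The notification handler later pops the same Event object and sets it:
assert events.pop("cd5cbe34", "network-vif-plugged-5be4618b") is waiter
```

Registering the waiter before `plug()` is called (as the log ordering shows) closes the race where the vif-plugged notification could arrive before anyone is listening for it.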
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.106 2 DEBUG nova.virt.libvirt.vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1033926726',display_name='tempest-TestGettingAddress-server-1033926726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1033926726',id=154,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-v2dngbvu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:49Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cd5cbe34-283a-4177-9d0b-bc35fadcde72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.106 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.107 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:d5:ad,bridge_name='br-int',has_traffic_filtering=True,id=5be4618b-6dbd-4495-af12-ea729df149d7,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be4618b-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.107 2 DEBUG os_vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:d5:ad,bridge_name='br-int',has_traffic_filtering=True,id=5be4618b-6dbd-4495-af12-ea729df149d7,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be4618b-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.110 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5be4618b-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.110 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5be4618b-6d, col_values=(('external_ids', {'iface-id': '5be4618b-6dbd-4495-af12-ea729df149d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:d5:ad', 'vm-uuid': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 NetworkManager[51205]: <info>  [1759408497.1127] manager: (tap5be4618b-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.118 2 INFO os_vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:d5:ad,bridge_name='br-int',has_traffic_filtering=True,id=5be4618b-6dbd-4495-af12-ea729df149d7,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be4618b-6d')#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.119 2 DEBUG nova.virt.libvirt.vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1033926726',display_name='tempest-TestGettingAddress-server-1033926726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1033926726',id=154,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-v2dngbvu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:49Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cd5cbe34-283a-4177-9d0b-bc35fadcde72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.119 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.119 2 DEBUG nova.network.os_vif_util [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:00:e6,bridge_name='br-int',has_traffic_filtering=True,id=a9826a7d-3298-45d6-b564-4b199244dec1,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9826a7d-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.120 2 DEBUG os_vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:00:e6,bridge_name='br-int',has_traffic_filtering=True,id=a9826a7d-3298-45d6-b564-4b199244dec1,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9826a7d-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.120 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.121 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.123 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9826a7d-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.123 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa9826a7d-32, col_values=(('external_ids', {'iface-id': 'a9826a7d-3298-45d6-b564-4b199244dec1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:00:e6', 'vm-uuid': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466013 NetworkManager[51205]: <info>  [1759408497.1259] manager: (tapa9826a7d-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.133 2 INFO os_vif [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:00:e6,bridge_name='br-int',has_traffic_filtering=True,id=a9826a7d-3298-45d6-b564-4b199244dec1,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9826a7d-32')#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.199 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.199 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.200 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:6a:d5:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.200 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:5d:00:e6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.200 2 INFO nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Using config drive#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.711 2 INFO nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Creating config drive at /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.config#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.720 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcqwzj302 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.863 2 DEBUG oslo_concurrency.processutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcqwzj302" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:57 np0005466013 kernel: tap5be4618b-6d: entered promiscuous mode
Oct  2 08:34:57 np0005466013 NetworkManager[51205]: <info>  [1759408497.9445] manager: (tap5be4618b-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/290)
Oct  2 08:34:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:57Z|00657|binding|INFO|Claiming lport 5be4618b-6dbd-4495-af12-ea729df149d7 for this chassis.
Oct  2 08:34:57 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:57Z|00658|binding|INFO|5be4618b-6dbd-4495-af12-ea729df149d7: Claiming fa:16:3e:6a:d5:ad 10.100.0.5
Oct  2 08:34:57 np0005466013 nova_compute[192144]: 2025-10-02 12:34:57.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466013 NetworkManager[51205]: <info>  [1759408497.9673] manager: (tapa9826a7d-32): new Tun device (/org/freedesktop/NetworkManager/Devices/291)
Oct  2 08:34:57 np0005466013 kernel: tapa9826a7d-32: entered promiscuous mode
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:57.993 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:d5:ad 10.100.0.5'], port_security=['fa:16:3e:6a:d5:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4660930-bac7-4d92-b95e-2296da9c1763, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=5be4618b-6dbd-4495-af12-ea729df149d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:57.995 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 5be4618b-6dbd-4495-af12-ea729df149d7 in datapath 385e0a9e-c250-418d-8cab-e7e3ae4506c1 bound to our chassis#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:57.999 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 385e0a9e-c250-418d-8cab-e7e3ae4506c1#033[00m
Oct  2 08:34:58 np0005466013 systemd-udevd[245770]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:58 np0005466013 systemd-udevd[245772]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.020 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[27b045a9-258c-4437-b226-4ff253a59161]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.021 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap385e0a9e-c1 in ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.025 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap385e0a9e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.025 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[531aa12c-86ee-46a8-bb6c-a60a783e774d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.026 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f34ba1cb-9e30-42e6-944d-7ae65e030179]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 NetworkManager[51205]: <info>  [1759408498.0359] device (tap5be4618b-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:34:58 np0005466013 NetworkManager[51205]: <info>  [1759408498.0372] device (tap5be4618b-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:34:58 np0005466013 NetworkManager[51205]: <info>  [1759408498.0442] device (tapa9826a7d-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.044 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[23f72721-f466-427e-9722-fae3974b6b14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 NetworkManager[51205]: <info>  [1759408498.0456] device (tapa9826a7d-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:34:58 np0005466013 systemd-machined[152202]: New machine qemu-72-instance-0000009a.
Oct  2 08:34:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:58Z|00659|binding|INFO|Claiming lport a9826a7d-3298-45d6-b564-4b199244dec1 for this chassis.
Oct  2 08:34:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:58Z|00660|binding|INFO|a9826a7d-3298-45d6-b564-4b199244dec1: Claiming fa:16:3e:5d:00:e6 2001:db8::f816:3eff:fe5d:e6
Oct  2 08:34:58 np0005466013 systemd[1]: Started Virtual Machine qemu-72-instance-0000009a.
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.082 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:00:e6 2001:db8::f816:3eff:fe5d:e6'], port_security=['fa:16:3e:5d:00:e6 2001:db8::f816:3eff:fe5d:e6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5d:e6/64', 'neutron:device_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4522b631-3a21-451f-8605-7c2b34273ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09de0cb2-1ed0-42b1-8efb-533f84345cc8, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a9826a7d-3298-45d6-b564-4b199244dec1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:58Z|00661|binding|INFO|Setting lport 5be4618b-6dbd-4495-af12-ea729df149d7 ovn-installed in OVS
Oct  2 08:34:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:58Z|00662|binding|INFO|Setting lport 5be4618b-6dbd-4495-af12-ea729df149d7 up in Southbound
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.087 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[37056af9-dd75-411f-9459-f43a5750e8ee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:58Z|00663|binding|INFO|Setting lport a9826a7d-3298-45d6-b564-4b199244dec1 ovn-installed in OVS
Oct  2 08:34:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:58Z|00664|binding|INFO|Setting lport a9826a7d-3298-45d6-b564-4b199244dec1 up in Southbound
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.121 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[47b3d417-c19e-43f2-88c3-420358af7cc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.126 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a63607-d4c1-4c0f-940c-cdb9c33663fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 NetworkManager[51205]: <info>  [1759408498.1266] manager: (tap385e0a9e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/292)
Oct  2 08:34:58 np0005466013 systemd-udevd[245777]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.166 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4d582d92-b18e-4e61-8e42-45f63a4a66ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.169 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa3f51e-3e81-4dc4-8bcb-b564065262bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 NetworkManager[51205]: <info>  [1759408498.1899] device (tap385e0a9e-c0): carrier: link connected
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.195 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[095e7f26-6d55-4e7d-994e-f4f65c1d6a40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.220 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e1824553-6f84-4a13-8157-22d49b7415b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e0a9e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f8:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648980, 'reachable_time': 36856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245805, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.244 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6adf66c5-b14a-4cb7-a54d-9c2bd0e23b00]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:f8af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 648980, 'tstamp': 648980}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245806, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.266 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[39d85dd6-a87f-45e0-87b2-9ffeefe58ba2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e0a9e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f8:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648980, 'reachable_time': 36856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245807, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.302 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5d20cfa9-cd02-4443-8b69-4977d1868baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.377 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ac564c-601b-4c73-9ffc-701a4d40a276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.379 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e0a9e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.379 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.380 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap385e0a9e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 NetworkManager[51205]: <info>  [1759408498.4200] manager: (tap385e0a9e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Oct  2 08:34:58 np0005466013 kernel: tap385e0a9e-c0: entered promiscuous mode
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.424 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap385e0a9e-c0, col_values=(('external_ids', {'iface-id': '9e6245a0-6013-48e5-9e96-a79fafe59b6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 ovn_controller[94366]: 2025-10-02T12:34:58Z|00665|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.427 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/385e0a9e-c250-418d-8cab-e7e3ae4506c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/385e0a9e-c250-418d-8cab-e7e3ae4506c1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.427 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4778a60e-3797-4687-968f-e5d7688c5314]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.428 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-385e0a9e-c250-418d-8cab-e7e3ae4506c1
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/385e0a9e-c250-418d-8cab-e7e3ae4506c1.pid.haproxy
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 385e0a9e-c250-418d-8cab-e7e3ae4506c1
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:34:58 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:34:58.429 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'env', 'PROCESS_TAG=haproxy-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/385e0a9e-c250-418d-8cab-e7e3ae4506c1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.538 2 DEBUG nova.network.neutron [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updated VIF entry in instance network info cache for port a9826a7d-3298-45d6-b564-4b199244dec1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.540 2 DEBUG nova.network.neutron [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updating instance_info_cache with network_info: [{"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:58 np0005466013 nova_compute[192144]: 2025-10-02 12:34:58.569 2 DEBUG oslo_concurrency.lockutils [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:58 np0005466013 podman[245840]: 2025-10-02 12:34:58.835046231 +0000 UTC m=+0.024585692 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.362 2 DEBUG nova.compute.manager [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.365 2 DEBUG oslo_concurrency.lockutils [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.365 2 DEBUG oslo_concurrency.lockutils [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.366 2 DEBUG oslo_concurrency.lockutils [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.366 2 DEBUG nova.compute.manager [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Processing event network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.366 2 DEBUG nova.compute.manager [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.366 2 DEBUG oslo_concurrency.lockutils [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.367 2 DEBUG oslo_concurrency.lockutils [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.367 2 DEBUG oslo_concurrency.lockutils [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.367 2 DEBUG nova.compute.manager [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] No event matching network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 in dict_keys([('network-vif-plugged', 'a9826a7d-3298-45d6-b564-4b199244dec1')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.367 2 WARNING nova.compute.manager [req-5ab82f80-bdea-4280-995c-97ce0589f4cf req-1a3b4302-5a26-4386-8aa8-83daad75736e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received unexpected event network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.584 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408499.58401, cd5cbe34-283a-4177-9d0b-bc35fadcde72 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.586 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] VM Started (Lifecycle Event)#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.611 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.618 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408499.584209, cd5cbe34-283a-4177-9d0b-bc35fadcde72 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.618 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.646 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.650 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:34:59 np0005466013 nova_compute[192144]: 2025-10-02 12:34:59.676 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:34:59 np0005466013 podman[245840]: 2025-10-02 12:34:59.717461929 +0000 UTC m=+0.907001370 container create 7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:34:59 np0005466013 systemd[1]: Started libpod-conmon-7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1.scope.
Oct  2 08:34:59 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:34:59 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44d07cb0920fd101e16bb8cc5218a44289b5c60f4fe6103599a50c87912cc307/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:34:59 np0005466013 podman[245840]: 2025-10-02 12:34:59.991774685 +0000 UTC m=+1.181314106 container init 7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:00 np0005466013 podman[245840]: 2025-10-02 12:35:00.000097697 +0000 UTC m=+1.189637108 container start 7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:35:00 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [NOTICE]   (245867) : New worker (245869) forked
Oct  2 08:35:00 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [NOTICE]   (245867) : Loading success.
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.034 2 DEBUG nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.034 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.035 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.035 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.035 2 DEBUG nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Processing event network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.035 2 DEBUG nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.035 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.036 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.036 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.036 2 DEBUG nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] No waiting events found dispatching network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.036 2 WARNING nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received unexpected event network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.037 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.042 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408500.0417337, cd5cbe34-283a-4177-9d0b-bc35fadcde72 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.043 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.045 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.051 2 INFO nova.virt.libvirt.driver [-] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Instance spawned successfully.#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.053 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.071 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.076 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.081 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a9826a7d-3298-45d6-b564-4b199244dec1 in datapath 4522b631-3a21-451f-8605-7c2b34273ecd unbound from our chassis#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.084 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4522b631-3a21-451f-8605-7c2b34273ecd#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.088 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.089 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.090 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.090 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.091 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.092 2 DEBUG nova.virt.libvirt.driver [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.102 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c51f4007-4843-4c47-81a4-b2778c699292]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.103 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4522b631-31 in ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.106 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4522b631-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.106 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2f297d-7fdb-4696-89ae-9b48c6220903]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.107 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[54b403b3-3fec-4be5-9189-76878b914aa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.120 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[975ab198-2510-4ccd-8cc8-f75166861587]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.127 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.144 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[85cb571e-8ea4-42a4-93d1-b011a0aa9443]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.168 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[53006ed2-7aee-46e1-a80c-a81baf89e579]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 NetworkManager[51205]: <info>  [1759408500.1778] manager: (tap4522b631-30): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Oct  2 08:35:00 np0005466013 systemd-udevd[245798]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.176 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4f0edb16-c394-43a2-bad2-ff8708f1ddb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.195 2 INFO nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Took 10.13 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.196 2 DEBUG nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.223 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[532f3c3a-a71c-411f-b834-45a00d536fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.226 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd93745-2a7a-44bd-ad31-9d29b99f04a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 NetworkManager[51205]: <info>  [1759408500.2482] device (tap4522b631-30): carrier: link connected
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.252 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1c6780-d57c-41bb-a15f-e92c6c5ce1b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.274 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[081f7005-7a9f-4dbc-926d-4bf45936f8d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4522b631-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:e8:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649186, 'reachable_time': 22561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245888, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.295 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f16c8294-6550-44a7-9e2c-6c64de77ce19]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea2:e8b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 649186, 'tstamp': 649186}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245889, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.312 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[356f1139-d44b-4899-b51d-c0b679020e23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4522b631-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:e8:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649186, 'reachable_time': 22561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245890, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.320 2 INFO nova.compute.manager [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Took 10.67 seconds to build instance.#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.345 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[91901415-5c08-4097-9963-6f08ff3d7740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.347 2 DEBUG oslo_concurrency.lockutils [None req-d10eea99-a3e8-41e5-8180-3cc21a2f8f4f 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.385 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[27cbbf1c-49a7-4a35-9bba-3e152d594626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.387 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4522b631-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.387 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.388 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4522b631-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:00 np0005466013 NetworkManager[51205]: <info>  [1759408500.3919] manager: (tap4522b631-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Oct  2 08:35:00 np0005466013 kernel: tap4522b631-30: entered promiscuous mode
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.397 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4522b631-30, col_values=(('external_ids', {'iface-id': '0965d19d-88e4-4971-9eb7-5bedfed08cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:00Z|00666|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.400 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4522b631-3a21-451f-8605-7c2b34273ecd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4522b631-3a21-451f-8605-7c2b34273ecd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.401 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d2fb8bd2-6a38-451c-b5c9-b0269eabb5e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.401 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-4522b631-3a21-451f-8605-7c2b34273ecd
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/4522b631-3a21-451f-8605-7c2b34273ecd.pid.haproxy
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 4522b631-3a21-451f-8605-7c2b34273ecd
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:35:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:00.402 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'env', 'PROCESS_TAG=haproxy-4522b631-3a21-451f-8605-7c2b34273ecd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4522b631-3a21-451f-8605-7c2b34273ecd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:35:00 np0005466013 nova_compute[192144]: 2025-10-02 12:35:00.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:00 np0005466013 podman[245920]: 2025-10-02 12:35:00.779823862 +0000 UTC m=+0.056177633 container create 370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:35:00 np0005466013 systemd[1]: Started libpod-conmon-370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9.scope.
Oct  2 08:35:00 np0005466013 podman[245920]: 2025-10-02 12:35:00.751740891 +0000 UTC m=+0.028094712 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:35:00 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:35:00 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5e9c2b14c51779ee1ae18effd476d9d4403db9fb19b87b899e448ec4eb6d222/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:35:00 np0005466013 podman[245920]: 2025-10-02 12:35:00.878802917 +0000 UTC m=+0.155156688 container init 370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:35:00 np0005466013 podman[245933]: 2025-10-02 12:35:00.881081479 +0000 UTC m=+0.066413045 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:35:00 np0005466013 podman[245920]: 2025-10-02 12:35:00.888927585 +0000 UTC m=+0.165281356 container start 370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:00 np0005466013 podman[245934]: 2025-10-02 12:35:00.894865172 +0000 UTC m=+0.076956746 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:35:00 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [NOTICE]   (245981) : New worker (245983) forked
Oct  2 08:35:00 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [NOTICE]   (245981) : Loading success.
Oct  2 08:35:02 np0005466013 nova_compute[192144]: 2025-10-02 12:35:02.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:02.319 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:02.320 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:02.321 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:03 np0005466013 nova_compute[192144]: 2025-10-02 12:35:03.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005466013 NetworkManager[51205]: <info>  [1759408503.6175] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Oct  2 08:35:03 np0005466013 NetworkManager[51205]: <info>  [1759408503.6188] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Oct  2 08:35:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:03Z|00667|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:35:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:03Z|00668|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:35:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:03Z|00669|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:35:03 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:03Z|00670|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:35:03 np0005466013 nova_compute[192144]: 2025-10-02 12:35:03.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:05 np0005466013 nova_compute[192144]: 2025-10-02 12:35:05.124 2 DEBUG nova.compute.manager [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-changed-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:05 np0005466013 nova_compute[192144]: 2025-10-02 12:35:05.124 2 DEBUG nova.compute.manager [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing instance network info cache due to event network-changed-5be4618b-6dbd-4495-af12-ea729df149d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:05 np0005466013 nova_compute[192144]: 2025-10-02 12:35:05.124 2 DEBUG oslo_concurrency.lockutils [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:05 np0005466013 nova_compute[192144]: 2025-10-02 12:35:05.124 2 DEBUG oslo_concurrency.lockutils [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:05 np0005466013 nova_compute[192144]: 2025-10-02 12:35:05.125 2 DEBUG nova.network.neutron [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing network info cache for port 5be4618b-6dbd-4495-af12-ea729df149d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:05 np0005466013 nova_compute[192144]: 2025-10-02 12:35:05.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:07 np0005466013 nova_compute[192144]: 2025-10-02 12:35:07.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:10 np0005466013 nova_compute[192144]: 2025-10-02 12:35:10.131 2 DEBUG nova.network.neutron [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updated VIF entry in instance network info cache for port 5be4618b-6dbd-4495-af12-ea729df149d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:35:10 np0005466013 nova_compute[192144]: 2025-10-02 12:35:10.132 2 DEBUG nova.network.neutron [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updating instance_info_cache with network_info: [{"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:10 np0005466013 nova_compute[192144]: 2025-10-02 12:35:10.154 2 DEBUG oslo_concurrency.lockutils [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:10 np0005466013 nova_compute[192144]: 2025-10-02 12:35:10.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:12 np0005466013 nova_compute[192144]: 2025-10-02 12:35:12.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:12 np0005466013 nova_compute[192144]: 2025-10-02 12:35:12.394 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:12 np0005466013 nova_compute[192144]: 2025-10-02 12:35:12.413 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Triggering sync for uuid cd5cbe34-283a-4177-9d0b-bc35fadcde72 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:35:12 np0005466013 nova_compute[192144]: 2025-10-02 12:35:12.414 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:12 np0005466013 nova_compute[192144]: 2025-10-02 12:35:12.414 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:12 np0005466013 nova_compute[192144]: 2025-10-02 12:35:12.437 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:15 np0005466013 nova_compute[192144]: 2025-10-02 12:35:15.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:15.286 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:15.287 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:35:15 np0005466013 nova_compute[192144]: 2025-10-02 12:35:15.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.360 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'name': 'tempest-TestGettingAddress-server-1033926726', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009a', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'hostId': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.386 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.read.latency volume: 1005279908 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.387 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.read.latency volume: 1201945187 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82ce5704-6ce0-40ce-83fd-8541a68e381e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1005279908, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.361526', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fd86c6c-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '0918ba5bf8b1d895e1e2fd089673a5c7834f266eaa2022181f1613dcde046e30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1201945187, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.361526', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fd8844a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '83afd2f72d800b3af2188235116f91ff80efbee354ade8a7d986b4877f1a0032'}]}, 'timestamp': '2025-10-02 12:35:16.388015', '_unique_id': '2c285b4795cb4b429ed27f9d0e3e450a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.390 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.391 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.392 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.write.latency volume: 78897323216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.392 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2161233-2710-44f7-88e6-6914d10b9a9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 78897323216, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.392117', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fd939a8-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '964f0ddcefa0cf22d167a8c397c9fa543f5fdde7671790c51e61a1217d72ee54'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.392117', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fd94cae-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '31d79ffedde7755d645ee81d5c8732c41a6f8fd19512a4ae28b3e3672387cf50'}]}, 'timestamp': '2025-10-02 12:35:16.393302', '_unique_id': 'e26a90c0f7564e33b4ee5f69ee7e4289'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.394 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.395 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.395 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.396 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>]
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.396 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.400 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cd5cbe34-283a-4177-9d0b-bc35fadcde72 / tap5be4618b-6d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.401 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cd5cbe34-283a-4177-9d0b-bc35fadcde72 / tapa9826a7d-32 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.402 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.402 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0586e36b-5fc5-4324-acb0-7740bbae2707', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.396602', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fdabea4-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '9134cda9eb24c46ede561194118084f6f16a9414b2108ac14ff07d70be73caca'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.396602', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fdad402-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '50fc253bdc03031636f250d078a852c81f79be9e8355bc8b72098ef5b3d76bc2'}]}, 'timestamp': '2025-10-02 12:35:16.403122', '_unique_id': 'd9065558986745c3a4edcd856297737f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.405 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.406 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3342a3db-2b37-483e-8933-e213ca82b592', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.405894', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fdb533c-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '08892a760a4ea060dc9538201cde46d90a2d55f6859e176d50c8c61e03aad5e7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.405894', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fdb64b2-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'dde8234717df0132e7433d26e62da6757287736cd0392d7e07b97e7a1a5f0635'}]}, 'timestamp': '2025-10-02 12:35:16.406865', '_unique_id': 'af9ce6755c5c4afab9e2c819f0c2fadf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.407 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.409 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.409 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e7e4b88-3f30-4eaa-923d-2cd5320f26d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.409430', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fdbdce4-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '220d23d70733a0741591c38db25027436f5a5d88dc365b872237a453c2d8f557'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.409430', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fdbef54-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'd7d8fb2580ed8609d4cf27ff45e93cca7f2a5196cb607b6028df33e4c8dd67e4'}]}, 'timestamp': '2025-10-02 12:35:16.410358', '_unique_id': 'c789fc0a2b4c4ddcb108053da5aeb669'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.411 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.412 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.413 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aeb489f-4245-41be-8555-47482a7de5d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.412726', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fdc5f2a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '3df2b5fb0fe96c14fc728df90e7850404e8e6dc51444494da91f97d1115a98ec'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.412726', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fdc7028-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'f45fa6519a4a8376cc66861e3e6f935bddb9980554eefa7e346ae31d17543cb3'}]}, 'timestamp': '2025-10-02 12:35:16.413656', '_unique_id': 'f85e0410242e415f981ce0408b48d1c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.414 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.415 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.416 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.read.bytes volume: 29867008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.416 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e691fd4-2bae-436c-b950-5af4af6198ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29867008, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.416020', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fdcde5a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': 'f9c26187354f5cd45fb4a1d2eb68f4326be7d46d6309e61bdc135990df8e6ff7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.416020', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fdceecc-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': 'dd71872d6f1ff553c803ad6a9f3dbbab18ba44a7b0f11e6a2429435c0b3b9e09'}]}, 'timestamp': '2025-10-02 12:35:16.416920', '_unique_id': 'f2006ec8f5ee403ca39e673af3d800a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.417 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.419 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.432 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.usage volume: 28442624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.433 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fccdb500-0dc2-4db6-bc3a-f552bc915aae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28442624, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.419335', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fdf7020-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.097602357, 'message_signature': '4a367b06ac22ee70ba5f9196f660ec3c93a5cf33efd0b660578fc0d1612d0e7a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.419335', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fdf85ce-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.097602357, 'message_signature': 'f7a0ddbab7013985a4e97ca790824f2d59e9c5682892fed334798ea31969849c'}]}, 'timestamp': '2025-10-02 12:35:16.433917', '_unique_id': 'c22b1cd7d5d047d3a7886fabe49a03ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.435 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.436 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.437 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.437 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efb8ccf1-8cdf-45dc-ad1a-fcdcc9cd4c51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.437132', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fe01b06-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '74d4d3cbb8a52346b7a0a3019fc463b66f3a1ab5033066ed73641f9a4524961c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.437132', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fe03618-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'b3ac2a3084c23fa5e534e52a9d1f8c1dbad91f5f1613025525af627accea5cf2'}]}, 'timestamp': '2025-10-02 12:35:16.438501', '_unique_id': '32c40f354d6e4d4baed252e0a086a481'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.441 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.441 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.441 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24ec9eb5-88e9-4dd0-9410-10ab8b5008cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.441209', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fe0b6d8-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'e4d81318215511e8100c93ccc114caa667a1b3bf87a056d529b170af9c07b60f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.441209', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fe0cdc6-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '39f552edee9d54f3e73fae0f0e8bb525e9a1c7b2f4f2dbba24f17199cd239e40'}]}, 'timestamp': '2025-10-02 12:35:16.442280', '_unique_id': '8b4dc4a0bd314be0b86e863c9afa73a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.445 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.write.requests volume: 237 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.445 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c24807be-bc21-4378-a319-a1e6d187381e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 237, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.445152', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fe1523c-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '89bb4c429de2b6d68028cc9e649484be1834fab1bcb98a9220a5bf3d2e65fb50'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.445152', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fe16a2e-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': 'f522949ceb7543f9d8cc80d5a694524116bf0f596bb5d089cea9bf4e6cc8dda4'}]}, 'timestamp': '2025-10-02 12:35:16.446345', '_unique_id': '3e385fb1670c4acbad6ddb66015dd2f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.447 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.449 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.450 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd21a2f5-b61d-453d-a837-fc883d2bacfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.449547', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fe1fc6e-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.097602357, 'message_signature': '7da23611bf0ba0a351445411cd749f90a856822c5d42198b1b09aa1b200cc49a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.449547', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fe210dc-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.097602357, 'message_signature': 'd0fec94e6709537cb233f0494da44d26bef44802eb70e820e2ed2aafe440a16f'}]}, 'timestamp': '2025-10-02 12:35:16.450556', '_unique_id': 'f857ed271cc44d7ba84393128852698e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.453 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.453 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>]
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.453 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.454 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2f88001-165a-4eea-b7c4-e1c1fd2c9019', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.453867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fe2a6a0-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '0ecc5a6855bb440c150de419f5198533553db1ce2a3fe2358f876045efa02c1e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.453867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fe2b85c-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'dcbecc691a81ed2d8169197ef2e19449d53ace4bc1a7d3715891394c9259f2bd'}]}, 'timestamp': '2025-10-02 12:35:16.454895', '_unique_id': '7e8df2be3c084872ac93232a138257c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.455 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.457 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.478 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/memory.usage volume: 40.41015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f420a12-5d03-4019-8a6f-b4f1dde654f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.41015625, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'timestamp': '2025-10-02T12:35:16.457347', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3fe65746-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.156151014, 'message_signature': '405d5ce499b66b0239566a2528590e3a3e86c9cbf5f78bcd6df2b4402e9906bf'}]}, 'timestamp': '2025-10-02 12:35:16.478575', '_unique_id': 'ba95a6eb986147db80ff31f1166925a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.479 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.480 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.481 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.481 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2aacca99-7cd9-4e8a-b067-475a3657e306', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.481041', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fe6ca8c-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '92f4762ea2c627410290bf7bc266ebaee419f9b5c5ee8ac592f51c4831572ce7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.481041', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fe6dd06-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'c2080cb72f9c01b313fd4e97ca71fafea2451f52ad57f9077d9c9a1687168d57'}]}, 'timestamp': '2025-10-02 12:35:16.482018', '_unique_id': '2c618002b4e44169b3b0d85a224dfb2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.484 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.485 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.write.bytes volume: 25677824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.485 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce8fd5b8-8670-46ca-b0a1-cc386a795146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25677824, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.485022', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fe76352-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '3a74959766d25a3254f6037c646bf49667b2acea9424d3c5f84dbf6c4ef9f278'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.485022', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fe76ece-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '67608122e3c718a56fc1f045d14890d5194f81e2b4cd67372dba0b951a7995be'}]}, 'timestamp': '2025-10-02 12:35:16.485638', '_unique_id': 'cc6791adb9fa4134b055da22b57de15f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.487 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.487 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.allocation volume: 29237248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.487 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4d1ce70-3ac8-47d4-9849-3d3651f076be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29237248, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.487218', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fe7b834-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.097602357, 'message_signature': 'fece55bb45d62f0569898b3b999efe269f19294b4d837767562d7c8a2dbce16d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.487218', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fe7c306-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.097602357, 'message_signature': 'e4b282d687fa1b9afa3dd84fdc01ef36c49c244e84994eaea8c881cc682e61c0'}]}, 'timestamp': '2025-10-02 12:35:16.487790', '_unique_id': '1c0b4d38164149dea5b6989e56308ee0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.489 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.bytes volume: 1218 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.489 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.incoming.bytes volume: 922 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2eb524dc-f65f-4cd5-874a-d2e375039223', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1218, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.489505', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fe811a8-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'f21e92d68eb6b86c8d109a3e78b6b47d2f9f3f825cf919ec9b32a7e98657277f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 922, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.489505', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fe81e1e-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '057fdffd883652f5e77cdfacf1c6bcc34faba5adbdd8f259391fef8797e6f2ec'}]}, 'timestamp': '2025-10-02 12:35:16.490137', '_unique_id': '5d0dfb8af2c94334a0452160224830a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.490 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.491 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.491 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/cpu volume: 11210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1753bec3-a0e2-47f5-b674-ebd48aabc9aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11210000000, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'timestamp': '2025-10-02T12:35:16.491760', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3fe86a9a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.156151014, 'message_signature': '778231bc8e5e2d1a355f995a70400470056211d3f85e651293cc4aec3a10e654'}]}, 'timestamp': '2025-10-02 12:35:16.492089', '_unique_id': 'f850e0d258de4181800e6485c28eec99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.492 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.493 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.493 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>]
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.493 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.494 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.read.requests volume: 1085 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.494 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a73f15e-60f3-4762-9f0d-c8831b5e4444', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1085, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-vda', 'timestamp': '2025-10-02T12:35:16.493994', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3fe8c09e-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': 'f8378200ff364797c44ffb90ac68d21e864388580edae16e271c5b2cdb37496a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72-sda', 'timestamp': '2025-10-02T12:35:16.493994', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'instance-0000009a', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3fe8cba2-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.039767692, 'message_signature': '9e1928a81617ec3542062b3257d265a4f0e771914003344e9e8dfecfac7f19f1'}]}, 'timestamp': '2025-10-02 12:35:16.494558', '_unique_id': 'a6d6d13425564d2f89f5c44f77b3a097'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.496 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.496 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1033926726>]
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.496 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.496 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.496 12 DEBUG ceilometer.compute.pollsters [-] cd5cbe34-283a-4177-9d0b-bc35fadcde72/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '151b481f-4b71-4880-992b-5e5f3c76f09c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tap5be4618b-6d', 'timestamp': '2025-10-02T12:35:16.496440', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tap5be4618b-6d', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:d5:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5be4618b-6d'}, 'message_id': '3fe92034-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': 'e46ea128a57e249941a1189b5942532034f9cebacd54768d6fc7c34f936bd7fa'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-0000009a-cd5cbe34-283a-4177-9d0b-bc35fadcde72-tapa9826a7d-32', 'timestamp': '2025-10-02T12:35:16.496440', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1033926726', 'name': 'tapa9826a7d-32', 'instance_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'instance_type': 'm1.nano', 'host': 'a3a7bbdcd062f4fb5f1fd41f2d588456d2645d6da252186f28342851', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:00:e6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa9826a7d-32'}, 'message_id': '3fe92c64-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6508.074857153, 'message_signature': '0db973eb6047d87d66e86309b3a4a45fed442d81a7d0ce07c750f86259205b3b'}]}, 'timestamp': '2025-10-02 12:35:16.497092', '_unique_id': '1880b7f3245b4b3ba75174d8123cacb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:35:16.497 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:35:16 np0005466013 podman[246015]: 2025-10-02 12:35:16.693422945 +0000 UTC m=+0.065376823 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:35:16 np0005466013 podman[246016]: 2025-10-02 12:35:16.704162662 +0000 UTC m=+0.076543513 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:35:16 np0005466013 podman[246017]: 2025-10-02 12:35:16.737812287 +0000 UTC m=+0.105423309 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:35:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:16Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:d5:ad 10.100.0.5
Oct  2 08:35:16 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:16Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:d5:ad 10.100.0.5
Oct  2 08:35:17 np0005466013 nova_compute[192144]: 2025-10-02 12:35:17.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:20 np0005466013 nova_compute[192144]: 2025-10-02 12:35:20.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:22 np0005466013 nova_compute[192144]: 2025-10-02 12:35:22.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466013 nova_compute[192144]: 2025-10-02 12:35:25.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:35:25.288 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:26 np0005466013 podman[246085]: 2025-10-02 12:35:26.726529646 +0000 UTC m=+0.091026727 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:35:26 np0005466013 podman[246084]: 2025-10-02 12:35:26.729021585 +0000 UTC m=+0.097561283 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:35:26 np0005466013 podman[246086]: 2025-10-02 12:35:26.736149268 +0000 UTC m=+0.091906385 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:35:27 np0005466013 nova_compute[192144]: 2025-10-02 12:35:27.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:29Z|00671|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:35:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:35:29Z|00672|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:35:29 np0005466013 nova_compute[192144]: 2025-10-02 12:35:29.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466013 nova_compute[192144]: 2025-10-02 12:35:30.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:31 np0005466013 podman[246143]: 2025-10-02 12:35:31.691692337 +0000 UTC m=+0.059781878 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:35:31 np0005466013 podman[246144]: 2025-10-02 12:35:31.693245846 +0000 UTC m=+0.066170358 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:32 np0005466013 nova_compute[192144]: 2025-10-02 12:35:32.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:35 np0005466013 nova_compute[192144]: 2025-10-02 12:35:35.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:36 np0005466013 nova_compute[192144]: 2025-10-02 12:35:36.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.027 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.107 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.162 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.163 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.217 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.362 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.363 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5522MB free_disk=73.17173767089844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.364 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.364 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.732 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance cd5cbe34-283a-4177-9d0b-bc35fadcde72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.732 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.733 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.816 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.953 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.954 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:35:37 np0005466013 nova_compute[192144]: 2025-10-02 12:35:37.986 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:35:38 np0005466013 nova_compute[192144]: 2025-10-02 12:35:38.012 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:35:38 np0005466013 nova_compute[192144]: 2025-10-02 12:35:38.087 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:38 np0005466013 nova_compute[192144]: 2025-10-02 12:35:38.104 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:38 np0005466013 nova_compute[192144]: 2025-10-02 12:35:38.151 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:35:38 np0005466013 nova_compute[192144]: 2025-10-02 12:35:38.151 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:39 np0005466013 nova_compute[192144]: 2025-10-02 12:35:39.152 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:39 np0005466013 nova_compute[192144]: 2025-10-02 12:35:39.152 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:35:39 np0005466013 nova_compute[192144]: 2025-10-02 12:35:39.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:40 np0005466013 nova_compute[192144]: 2025-10-02 12:35:40.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:40 np0005466013 nova_compute[192144]: 2025-10-02 12:35:40.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:42 np0005466013 nova_compute[192144]: 2025-10-02 12:35:42.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:43 np0005466013 nova_compute[192144]: 2025-10-02 12:35:43.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:44 np0005466013 nova_compute[192144]: 2025-10-02 12:35:44.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:45 np0005466013 nova_compute[192144]: 2025-10-02 12:35:45.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:47 np0005466013 nova_compute[192144]: 2025-10-02 12:35:47.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:47 np0005466013 podman[246201]: 2025-10-02 12:35:47.686699315 +0000 UTC m=+0.053238270 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  2 08:35:47 np0005466013 podman[246200]: 2025-10-02 12:35:47.697113723 +0000 UTC m=+0.074809499 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:35:47 np0005466013 podman[246202]: 2025-10-02 12:35:47.712461224 +0000 UTC m=+0.084662858 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  2 08:35:47 np0005466013 nova_compute[192144]: 2025-10-02 12:35:47.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:47 np0005466013 nova_compute[192144]: 2025-10-02 12:35:47.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:35:47 np0005466013 nova_compute[192144]: 2025-10-02 12:35:47.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.289 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.289 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.289 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.289 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid cd5cbe34-283a-4177-9d0b-bc35fadcde72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.365 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.366 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.366 2 INFO nova.compute.manager [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Unshelving#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.542 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.542 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.547 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.568 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.594 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.595 2 INFO nova.compute.claims [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.775 2 DEBUG nova.compute.provider_tree [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.809 2 DEBUG nova.scheduler.client.report [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:48 np0005466013 nova_compute[192144]: 2025-10-02 12:35:48.855 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:49 np0005466013 nova_compute[192144]: 2025-10-02 12:35:49.301 2 INFO nova.network.neutron [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating port 37bcb93a-8639-42b7-aafd-21f019307d66 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:35:50 np0005466013 nova_compute[192144]: 2025-10-02 12:35:50.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:51 np0005466013 nova_compute[192144]: 2025-10-02 12:35:51.414 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:51 np0005466013 nova_compute[192144]: 2025-10-02 12:35:51.414 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:51 np0005466013 nova_compute[192144]: 2025-10-02 12:35:51.415 2 DEBUG nova.network.neutron [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:35:51 np0005466013 nova_compute[192144]: 2025-10-02 12:35:51.554 2 DEBUG nova.compute.manager [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:51 np0005466013 nova_compute[192144]: 2025-10-02 12:35:51.554 2 DEBUG nova.compute.manager [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing instance network info cache due to event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:51 np0005466013 nova_compute[192144]: 2025-10-02 12:35:51.555 2 DEBUG oslo_concurrency.lockutils [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:52 np0005466013 nova_compute[192144]: 2025-10-02 12:35:52.084 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updating instance_info_cache with network_info: [{"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:52 np0005466013 nova_compute[192144]: 2025-10-02 12:35:52.112 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:52 np0005466013 nova_compute[192144]: 2025-10-02 12:35:52.112 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:35:52 np0005466013 nova_compute[192144]: 2025-10-02 12:35:52.114 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:52 np0005466013 nova_compute[192144]: 2025-10-02 12:35:52.114 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:52 np0005466013 nova_compute[192144]: 2025-10-02 12:35:52.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.902 2 DEBUG nova.network.neutron [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.929 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.930 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.931 2 INFO nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Creating image(s)#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.932 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.932 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.933 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.933 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.934 2 DEBUG oslo_concurrency.lockutils [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.934 2 DEBUG nova.network.neutron [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.952 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "0008bb8726682bc026fb573eb2e22198988c3351" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:54 np0005466013 nova_compute[192144]: 2025-10-02 12:35:54.952 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "0008bb8726682bc026fb573eb2e22198988c3351" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:55 np0005466013 nova_compute[192144]: 2025-10-02 12:35:55.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:57 np0005466013 nova_compute[192144]: 2025-10-02 12:35:57.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:57 np0005466013 nova_compute[192144]: 2025-10-02 12:35:57.367 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:57 np0005466013 nova_compute[192144]: 2025-10-02 12:35:57.425 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.part --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:57 np0005466013 nova_compute[192144]: 2025-10-02 12:35:57.426 2 DEBUG nova.virt.images [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] 5afe2b05-02a1-45ef-8376-c5d63a6eac1b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:35:57 np0005466013 nova_compute[192144]: 2025-10-02 12:35:57.430 2 DEBUG nova.privsep.utils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:35:57 np0005466013 nova_compute[192144]: 2025-10-02 12:35:57.430 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.part /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:57 np0005466013 podman[246281]: 2025-10-02 12:35:57.698745281 +0000 UTC m=+0.058694343 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:35:57 np0005466013 podman[246279]: 2025-10-02 12:35:57.698932776 +0000 UTC m=+0.063632648 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:35:57 np0005466013 podman[246280]: 2025-10-02 12:35:57.711221292 +0000 UTC m=+0.080281860 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.037 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.part /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.converted" returned: 0 in 0.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.047 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.145 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351.converted --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.147 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "0008bb8726682bc026fb573eb2e22198988c3351" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.165 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.237 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.238 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "0008bb8726682bc026fb573eb2e22198988c3351" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.239 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "0008bb8726682bc026fb573eb2e22198988c3351" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.264 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.321 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.322 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351,backing_fmt=raw /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.353 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351,backing_fmt=raw /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.354 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "0008bb8726682bc026fb573eb2e22198988c3351" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.355 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.420 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0008bb8726682bc026fb573eb2e22198988c3351 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.421 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'migration_context' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.491 2 INFO nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Rebasing disk image.#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.491 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.553 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:58 np0005466013 nova_compute[192144]: 2025-10-02 12:35:58.554 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.713 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk" returned: 0 in 1.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.714 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.714 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Ensure instance console log exists: /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.714 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.714 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.715 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.716 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Start _get_guest_xml network_info=[{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='2bd841f60727f1821f66807d43f327a4',container_format='bare',created_at=2025-10-02T12:35:28Z,direct_url=<?>,disk_format='qcow2',id=5afe2b05-02a1-45ef-8376-c5d63a6eac1b,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-1978368192-shelved',owner='086ee425cb0949ab836e1b3ae489ced0',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-10-02T12:35:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.720 2 WARNING nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.724 2 DEBUG nova.virt.libvirt.host [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.724 2 DEBUG nova.virt.libvirt.host [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.728 2 DEBUG nova.virt.libvirt.host [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.728 2 DEBUG nova.virt.libvirt.host [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.729 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.729 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='2bd841f60727f1821f66807d43f327a4',container_format='bare',created_at=2025-10-02T12:35:28Z,direct_url=<?>,disk_format='qcow2',id=5afe2b05-02a1-45ef-8376-c5d63a6eac1b,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-1978368192-shelved',owner='086ee425cb0949ab836e1b3ae489ced0',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-10-02T12:35:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.730 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.730 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.730 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.730 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.730 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.730 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.731 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.731 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.731 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.731 2 DEBUG nova.virt.hardware [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.731 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.752 2 DEBUG nova.virt.libvirt.vif [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1978368192',display_name='tempest-TestShelveInstance-server-1978368192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1978368192',id=156,image_ref='5afe2b05-02a1-45ef-8376-c5d63a6eac1b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-537086882',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:35:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='086ee425cb0949ab836e1b3ae489ced0',ramdisk_id='',reservation_id='r-a0s1hayn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw
_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1329865483',owner_user_name='tempest-TestShelveInstance-1329865483-project-member',shelved_at='2025-10-02T12:35:35.031233',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='5afe2b05-02a1-45ef-8376-c5d63a6eac1b'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:48Z,user_data=None,user_id='81e456ca7bee486181b9c11ddb1f3ffd',uuid=1d931a6f-0703-4e1f-acfc-b8402834c14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.752 2 DEBUG nova.network.os_vif_util [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converting VIF {"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.753 2 DEBUG nova.network.os_vif_util [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.754 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.776 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <uuid>1d931a6f-0703-4e1f-acfc-b8402834c14d</uuid>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <name>instance-0000009c</name>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestShelveInstance-server-1978368192</nova:name>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:35:59</nova:creationTime>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:user uuid="81e456ca7bee486181b9c11ddb1f3ffd">tempest-TestShelveInstance-1329865483-project-member</nova:user>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:project uuid="086ee425cb0949ab836e1b3ae489ced0">tempest-TestShelveInstance-1329865483</nova:project>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="5afe2b05-02a1-45ef-8376-c5d63a6eac1b"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        <nova:port uuid="37bcb93a-8639-42b7-aafd-21f019307d66">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <entry name="serial">1d931a6f-0703-4e1f-acfc-b8402834c14d</entry>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <entry name="uuid">1d931a6f-0703-4e1f-acfc-b8402834c14d</entry>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:3b:d9:00"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <target dev="tap37bcb93a-86"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/console.log" append="off"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <input type="keyboard" bus="usb"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:35:59 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:35:59 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:35:59 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:35:59 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.777 2 DEBUG nova.compute.manager [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Preparing to wait for external event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.777 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.777 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.777 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.778 2 DEBUG nova.virt.libvirt.vif [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1978368192',display_name='tempest-TestShelveInstance-server-1978368192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1978368192',id=156,image_ref='5afe2b05-02a1-45ef-8376-c5d63a6eac1b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-537086882',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:35:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='086ee425cb0949ab836e1b3ae489ced0',ramdisk_id='',reservation_id='r-a0s1hayn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio
',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1329865483',owner_user_name='tempest-TestShelveInstance-1329865483-project-member',shelved_at='2025-10-02T12:35:35.031233',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='5afe2b05-02a1-45ef-8376-c5d63a6eac1b'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:48Z,user_data=None,user_id='81e456ca7bee486181b9c11ddb1f3ffd',uuid=1d931a6f-0703-4e1f-acfc-b8402834c14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.778 2 DEBUG nova.network.os_vif_util [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converting VIF {"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.778 2 DEBUG nova.network.os_vif_util [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.778 2 DEBUG os_vif [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.779 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.780 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37bcb93a-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37bcb93a-86, col_values=(('external_ids', {'iface-id': '37bcb93a-8639-42b7-aafd-21f019307d66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:d9:00', 'vm-uuid': '1d931a6f-0703-4e1f-acfc-b8402834c14d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:59 np0005466013 NetworkManager[51205]: <info>  [1759408559.7846] manager: (tap37bcb93a-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.792 2 INFO os_vif [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86')#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.855 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.855 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.855 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] No VIF found with MAC fa:16:3e:3b:d9:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.856 2 INFO nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Using config drive#033[00m
Oct  2 08:35:59 np0005466013 nova_compute[192144]: 2025-10-02 12:35:59.883 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.078 2 DEBUG nova.objects.instance [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'keypairs' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.424 2 DEBUG nova.network.neutron [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updated VIF entry in instance network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.424 2 DEBUG nova.network.neutron [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.442 2 DEBUG oslo_concurrency.lockutils [req-a40f394c-f6cf-4ef1-a204-61f3ad606c03 req-a2bd4ce8-4b03-4916-98cd-01d9620fe1ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.651 2 INFO nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Creating config drive at /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.656 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjvs2vz0t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.791 2 DEBUG oslo_concurrency.processutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjvs2vz0t" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:00 np0005466013 kernel: tap37bcb93a-86: entered promiscuous mode
Oct  2 08:36:00 np0005466013 NetworkManager[51205]: <info>  [1759408560.8439] manager: (tap37bcb93a-86): new Tun device (/org/freedesktop/NetworkManager/Devices/299)
Oct  2 08:36:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:00Z|00673|binding|INFO|Claiming lport 37bcb93a-8639-42b7-aafd-21f019307d66 for this chassis.
Oct  2 08:36:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:00Z|00674|binding|INFO|37bcb93a-8639-42b7-aafd-21f019307d66: Claiming fa:16:3e:3b:d9:00 10.100.0.9
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:00Z|00675|binding|INFO|Setting lport 37bcb93a-8639-42b7-aafd-21f019307d66 ovn-installed in OVS
Oct  2 08:36:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:00Z|00676|binding|INFO|Setting lport 37bcb93a-8639-42b7-aafd-21f019307d66 up in Southbound
Oct  2 08:36:00 np0005466013 nova_compute[192144]: 2025-10-02 12:36:00.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.860 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:d9:00 10.100.0.9'], port_security=['fa:16:3e:3b:d9:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1f1562c-389b-4488-b13e-0f3594ca916b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '086ee425cb0949ab836e1b3ae489ced0', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'e9c6c044-9bae-451d-9ac4-f29a1af96360', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23273790-9180-40d0-a3ca-fdfdfd7f3c59, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=37bcb93a-8639-42b7-aafd-21f019307d66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.861 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcb93a-8639-42b7-aafd-21f019307d66 in datapath a1f1562c-389b-4488-b13e-0f3594ca916b bound to our chassis#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.863 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1f1562c-389b-4488-b13e-0f3594ca916b#033[00m
Oct  2 08:36:00 np0005466013 systemd-udevd[246379]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.875 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b21804a6-e10c-4059-8d7a-2165648c8a67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.876 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1f1562c-31 in ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.878 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1f1562c-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.878 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7d59bfe8-d1c6-41a9-b450-39597740e05a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.879 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[23c4b63f-92ba-48b8-826a-636f8a473c60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 NetworkManager[51205]: <info>  [1759408560.8876] device (tap37bcb93a-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:36:00 np0005466013 NetworkManager[51205]: <info>  [1759408560.8887] device (tap37bcb93a-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:36:00 np0005466013 systemd-machined[152202]: New machine qemu-73-instance-0000009c.
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.897 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[d59a9252-2cba-4a67-a9d6-7b701015fed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 systemd[1]: Started Virtual Machine qemu-73-instance-0000009c.
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.913 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8b42da-0096-4df2-8fc0-c9d3378d1df8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.942 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[d55b7f5f-855d-4907-8fde-32b64555306d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 NetworkManager[51205]: <info>  [1759408560.9475] manager: (tapa1f1562c-30): new Veth device (/org/freedesktop/NetworkManager/Devices/300)
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.946 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8a527421-ab58-430f-9bce-4e4cbdc60640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 systemd-udevd[246383]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.974 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b415ee0a-9aa6-4dc4-ad41-0586ec26e54d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:00.977 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[977ee46e-2487-47d3-bbbe-189a3f2dcc1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:00 np0005466013 NetworkManager[51205]: <info>  [1759408560.9989] device (tapa1f1562c-30): carrier: link connected
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.004 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7b697950-e72f-444b-9e98-05e51727121d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.018 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[59c88fea-69c9-41f3-a1b1-391e4c1c5be9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1f1562c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e4:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655261, 'reachable_time': 23840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246412, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.037 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc1b842-30bb-42ea-8509-45edcd4bbd25]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e47a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655261, 'tstamp': 655261}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246413, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.053 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4da33c6c-89fd-4805-9f00-01ab3da39e0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1f1562c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e4:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655261, 'reachable_time': 23840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246414, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.082 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd04967-9b6a-46d1-b131-e434c45e4656]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.138 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[17e20f12-0f80-42b6-b4a9-d64483f1ffde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.139 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1f1562c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.140 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.140 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1f1562c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:01 np0005466013 NetworkManager[51205]: <info>  [1759408561.1971] manager: (tapa1f1562c-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:01 np0005466013 kernel: tapa1f1562c-30: entered promiscuous mode
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.201 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1f1562c-30, col_values=(('external_ids', {'iface-id': '765813dd-4eb1-46b7-adc3-4b198fc4dbfb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:01 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:01Z|00677|binding|INFO|Releasing lport 765813dd-4eb1-46b7-adc3-4b198fc4dbfb from this chassis (sb_readonly=0)
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.218 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1f1562c-389b-4488-b13e-0f3594ca916b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1f1562c-389b-4488-b13e-0f3594ca916b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.219 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aec25fee-9500-499a-b9e8-43fb11948a32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.220 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-a1f1562c-389b-4488-b13e-0f3594ca916b
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/a1f1562c-389b-4488-b13e-0f3594ca916b.pid.haproxy
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID a1f1562c-389b-4488-b13e-0f3594ca916b
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:36:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:01.220 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'env', 'PROCESS_TAG=haproxy-a1f1562c-389b-4488-b13e-0f3594ca916b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1f1562c-389b-4488-b13e-0f3594ca916b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.319 2 DEBUG nova.compute.manager [req-6d0519b6-1efc-4930-8eb6-dcae342b2894 req-005442e2-7277-49e8-b2ba-0c562acf8e69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.320 2 DEBUG oslo_concurrency.lockutils [req-6d0519b6-1efc-4930-8eb6-dcae342b2894 req-005442e2-7277-49e8-b2ba-0c562acf8e69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.320 2 DEBUG oslo_concurrency.lockutils [req-6d0519b6-1efc-4930-8eb6-dcae342b2894 req-005442e2-7277-49e8-b2ba-0c562acf8e69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.320 2 DEBUG oslo_concurrency.lockutils [req-6d0519b6-1efc-4930-8eb6-dcae342b2894 req-005442e2-7277-49e8-b2ba-0c562acf8e69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.320 2 DEBUG nova.compute.manager [req-6d0519b6-1efc-4930-8eb6-dcae342b2894 req-005442e2-7277-49e8-b2ba-0c562acf8e69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Processing event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:36:01 np0005466013 podman[246453]: 2025-10-02 12:36:01.652969979 +0000 UTC m=+0.063817313 container create 5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:01 np0005466013 systemd[1]: Started libpod-conmon-5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4.scope.
Oct  2 08:36:01 np0005466013 podman[246453]: 2025-10-02 12:36:01.616428712 +0000 UTC m=+0.027276066 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:36:01 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:36:01 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/343ed3a098ae6fd455a048098c66f6f8b70046fabf299b5a68427f2dfc11cf0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:01 np0005466013 podman[246453]: 2025-10-02 12:36:01.762684961 +0000 UTC m=+0.173532325 container init 5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:36:01 np0005466013 podman[246453]: 2025-10-02 12:36:01.774706818 +0000 UTC m=+0.185554152 container start 5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:36:01 np0005466013 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246468]: [NOTICE]   (246497) : New worker (246505) forked
Oct  2 08:36:01 np0005466013 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246468]: [NOTICE]   (246497) : Loading success.
Oct  2 08:36:01 np0005466013 podman[246471]: 2025-10-02 12:36:01.859764868 +0000 UTC m=+0.130485155 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:36:01 np0005466013 podman[246470]: 2025-10-02 12:36:01.872009582 +0000 UTC m=+0.142752930 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.933 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408561.932649, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.933 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Started (Lifecycle Event)#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.936 2 DEBUG nova.compute.manager [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.941 2 DEBUG nova.virt.libvirt.driver [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.945 2 INFO nova.virt.libvirt.driver [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance spawned successfully.#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.960 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.965 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.989 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.990 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408561.9328046, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:01 np0005466013 nova_compute[192144]: 2025-10-02 12:36:01.990 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:36:02 np0005466013 nova_compute[192144]: 2025-10-02 12:36:02.037 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:02 np0005466013 nova_compute[192144]: 2025-10-02 12:36:02.042 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408561.9407423, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:02 np0005466013 nova_compute[192144]: 2025-10-02 12:36:02.043 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:36:02 np0005466013 nova_compute[192144]: 2025-10-02 12:36:02.074 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:02 np0005466013 nova_compute[192144]: 2025-10-02 12:36:02.079 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:02 np0005466013 nova_compute[192144]: 2025-10-02 12:36:02.108 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:02.320 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:02.321 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:02.322 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.013 2 DEBUG nova.compute.manager [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.127 2 DEBUG oslo_concurrency.lockutils [None req-b0fa0417-884b-4bc6-9dc9-368f236d1b6c 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 14.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.419 2 DEBUG nova.compute.manager [req-ce9d5ee5-cc22-41b1-a8b1-3ee3795c6ea5 req-bfb41763-2c5b-441e-8689-053e5984775a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.420 2 DEBUG oslo_concurrency.lockutils [req-ce9d5ee5-cc22-41b1-a8b1-3ee3795c6ea5 req-bfb41763-2c5b-441e-8689-053e5984775a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.421 2 DEBUG oslo_concurrency.lockutils [req-ce9d5ee5-cc22-41b1-a8b1-3ee3795c6ea5 req-bfb41763-2c5b-441e-8689-053e5984775a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.421 2 DEBUG oslo_concurrency.lockutils [req-ce9d5ee5-cc22-41b1-a8b1-3ee3795c6ea5 req-bfb41763-2c5b-441e-8689-053e5984775a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.421 2 DEBUG nova.compute.manager [req-ce9d5ee5-cc22-41b1-a8b1-3ee3795c6ea5 req-bfb41763-2c5b-441e-8689-053e5984775a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] No waiting events found dispatching network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:03 np0005466013 nova_compute[192144]: 2025-10-02 12:36:03.422 2 WARNING nova.compute.manager [req-ce9d5ee5-cc22-41b1-a8b1-3ee3795c6ea5 req-bfb41763-2c5b-441e-8689-053e5984775a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received unexpected event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:36:04 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:04Z|00678|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:36:04 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:04Z|00679|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:36:04 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:04Z|00680|binding|INFO|Releasing lport 765813dd-4eb1-46b7-adc3-4b198fc4dbfb from this chassis (sb_readonly=0)
Oct  2 08:36:04 np0005466013 nova_compute[192144]: 2025-10-02 12:36:04.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005466013 nova_compute[192144]: 2025-10-02 12:36:04.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:05 np0005466013 nova_compute[192144]: 2025-10-02 12:36:05.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:08 np0005466013 nova_compute[192144]: 2025-10-02 12:36:08.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:09 np0005466013 nova_compute[192144]: 2025-10-02 12:36:09.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:10 np0005466013 nova_compute[192144]: 2025-10-02 12:36:10.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:14Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:d9:00 10.100.0.9
Oct  2 08:36:14 np0005466013 nova_compute[192144]: 2025-10-02 12:36:14.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:14 np0005466013 nova_compute[192144]: 2025-10-02 12:36:14.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:15 np0005466013 nova_compute[192144]: 2025-10-02 12:36:15.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:16.332 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:16 np0005466013 nova_compute[192144]: 2025-10-02 12:36:16.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:16.334 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:36:18 np0005466013 podman[246551]: 2025-10-02 12:36:18.682564604 +0000 UTC m=+0.052630503 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 08:36:18 np0005466013 podman[246550]: 2025-10-02 12:36:18.704134201 +0000 UTC m=+0.071711541 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:36:18 np0005466013 podman[246552]: 2025-10-02 12:36:18.735749433 +0000 UTC m=+0.096605202 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 08:36:19 np0005466013 nova_compute[192144]: 2025-10-02 12:36:19.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:20 np0005466013 nova_compute[192144]: 2025-10-02 12:36:20.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.650 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.651 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.651 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.651 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.651 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.666 2 INFO nova.compute.manager [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Terminating instance#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.683 2 DEBUG nova.compute.manager [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:36:23 np0005466013 kernel: tap37bcb93a-86 (unregistering): left promiscuous mode
Oct  2 08:36:23 np0005466013 NetworkManager[51205]: <info>  [1759408583.7215] device (tap37bcb93a-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:36:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:23Z|00681|binding|INFO|Releasing lport 37bcb93a-8639-42b7-aafd-21f019307d66 from this chassis (sb_readonly=0)
Oct  2 08:36:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:23Z|00682|binding|INFO|Setting lport 37bcb93a-8639-42b7-aafd-21f019307d66 down in Southbound
Oct  2 08:36:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:23Z|00683|binding|INFO|Removing iface tap37bcb93a-86 ovn-installed in OVS
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.746 2 DEBUG nova.compute.manager [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.747 2 DEBUG nova.compute.manager [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing instance network info cache due to event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.747 2 DEBUG oslo_concurrency.lockutils [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.747 2 DEBUG oslo_concurrency.lockutils [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.747 2 DEBUG nova.network.neutron [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:23.748 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:d9:00 10.100.0.9'], port_security=['fa:16:3e:3b:d9:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1f1562c-389b-4488-b13e-0f3594ca916b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '086ee425cb0949ab836e1b3ae489ced0', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'e9c6c044-9bae-451d-9ac4-f29a1af96360', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23273790-9180-40d0-a3ca-fdfdfd7f3c59, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=37bcb93a-8639-42b7-aafd-21f019307d66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:23.751 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcb93a-8639-42b7-aafd-21f019307d66 in datapath a1f1562c-389b-4488-b13e-0f3594ca916b unbound from our chassis#033[00m
Oct  2 08:36:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:23.753 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1f1562c-389b-4488-b13e-0f3594ca916b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:23.756 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dffd256a-5e6c-446f-97e5-44a487725dd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:23.757 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b namespace which is not needed anymore#033[00m
Oct  2 08:36:23 np0005466013 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Oct  2 08:36:23 np0005466013 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009c.scope: Consumed 14.059s CPU time.
Oct  2 08:36:23 np0005466013 systemd-machined[152202]: Machine qemu-73-instance-0000009c terminated.
Oct  2 08:36:23 np0005466013 kernel: tap37bcb93a-86: entered promiscuous mode
Oct  2 08:36:23 np0005466013 NetworkManager[51205]: <info>  [1759408583.9124] manager: (tap37bcb93a-86): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Oct  2 08:36:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:23Z|00684|binding|INFO|Claiming lport 37bcb93a-8639-42b7-aafd-21f019307d66 for this chassis.
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:23Z|00685|binding|INFO|37bcb93a-8639-42b7-aafd-21f019307d66: Claiming fa:16:3e:3b:d9:00 10.100.0.9
Oct  2 08:36:23 np0005466013 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246468]: [NOTICE]   (246497) : haproxy version is 2.8.14-c23fe91
Oct  2 08:36:23 np0005466013 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246468]: [NOTICE]   (246497) : path to executable is /usr/sbin/haproxy
Oct  2 08:36:23 np0005466013 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246468]: [WARNING]  (246497) : Exiting Master process...
Oct  2 08:36:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:23.923 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:d9:00 10.100.0.9'], port_security=['fa:16:3e:3b:d9:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1f1562c-389b-4488-b13e-0f3594ca916b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '086ee425cb0949ab836e1b3ae489ced0', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'e9c6c044-9bae-451d-9ac4-f29a1af96360', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23273790-9180-40d0-a3ca-fdfdfd7f3c59, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=37bcb93a-8639-42b7-aafd-21f019307d66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:23 np0005466013 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246468]: [ALERT]    (246497) : Current worker (246505) exited with code 143 (Terminated)
Oct  2 08:36:23 np0005466013 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246468]: [WARNING]  (246497) : All workers exited. Exiting... (0)
Oct  2 08:36:23 np0005466013 kernel: tap37bcb93a-86 (unregistering): left promiscuous mode
Oct  2 08:36:23 np0005466013 systemd[1]: libpod-5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4.scope: Deactivated successfully.
Oct  2 08:36:23 np0005466013 podman[246641]: 2025-10-02 12:36:23.939330683 +0000 UTC m=+0.068065267 container died 5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:36:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:23Z|00686|binding|INFO|Releasing lport 37bcb93a-8639-42b7-aafd-21f019307d66 from this chassis (sb_readonly=0)
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:23.953 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:d9:00 10.100.0.9'], port_security=['fa:16:3e:3b:d9:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1f1562c-389b-4488-b13e-0f3594ca916b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '086ee425cb0949ab836e1b3ae489ced0', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'e9c6c044-9bae-451d-9ac4-f29a1af96360', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23273790-9180-40d0-a3ca-fdfdfd7f3c59, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=37bcb93a-8639-42b7-aafd-21f019307d66) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:23 np0005466013 systemd[1]: var-lib-containers-storage-overlay-343ed3a098ae6fd455a048098c66f6f8b70046fabf299b5a68427f2dfc11cf0d-merged.mount: Deactivated successfully.
Oct  2 08:36:23 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4-userdata-shm.mount: Deactivated successfully.
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.998 2 INFO nova.virt.libvirt.driver [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance destroyed successfully.#033[00m
Oct  2 08:36:23 np0005466013 nova_compute[192144]: 2025-10-02 12:36:23.998 2 DEBUG nova.objects.instance [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'resources' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:24 np0005466013 podman[246641]: 2025-10-02 12:36:24.000219323 +0000 UTC m=+0.128953947 container cleanup 5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.010 2 DEBUG nova.compute.manager [req-cc708dc8-d77a-474a-86ea-217c2284a04a req-610515d1-ee33-4c1e-81aa-22e0d2be0e89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-unplugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.010 2 DEBUG oslo_concurrency.lockutils [req-cc708dc8-d77a-474a-86ea-217c2284a04a req-610515d1-ee33-4c1e-81aa-22e0d2be0e89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.011 2 DEBUG oslo_concurrency.lockutils [req-cc708dc8-d77a-474a-86ea-217c2284a04a req-610515d1-ee33-4c1e-81aa-22e0d2be0e89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.011 2 DEBUG oslo_concurrency.lockutils [req-cc708dc8-d77a-474a-86ea-217c2284a04a req-610515d1-ee33-4c1e-81aa-22e0d2be0e89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.011 2 DEBUG nova.compute.manager [req-cc708dc8-d77a-474a-86ea-217c2284a04a req-610515d1-ee33-4c1e-81aa-22e0d2be0e89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] No waiting events found dispatching network-vif-unplugged-37bcb93a-8639-42b7-aafd-21f019307d66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.012 2 DEBUG nova.compute.manager [req-cc708dc8-d77a-474a-86ea-217c2284a04a req-610515d1-ee33-4c1e-81aa-22e0d2be0e89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-unplugged-37bcb93a-8639-42b7-aafd-21f019307d66 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.014 2 DEBUG nova.virt.libvirt.vif [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1978368192',display_name='tempest-TestShelveInstance-server-1978368192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1978368192',id=156,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/XDGeJT9WLDU0HFLzjGkDPYXYzNnVTkBgasq3A2D12H9O5maW7G09qXMMNOwpxQcY9ezmdK5YuMVeh5Lmhul8cAhXsU4OmdH86TOpc/q67Xul+dL/ucyqS3TKQHf5rEA==',key_name='tempest-TestShelveInstance-537086882',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='086ee425cb0949ab836e1b3ae489ced0',ramdisk_id='',reservation_id='r-a0s1hayn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1329865483',owner_user_name='tempest-TestShelveInstance-1329865483-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:03Z,user_data=None,user_id='81e456ca7bee486181b9c11ddb1f3ffd',uuid=1d931a6f-0703-4e1f-acfc-b8402834c14d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.015 2 DEBUG nova.network.os_vif_util [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converting VIF {"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.016 2 DEBUG nova.network.os_vif_util [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.017 2 DEBUG os_vif [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.020 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37bcb93a-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:24 np0005466013 systemd[1]: libpod-conmon-5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4.scope: Deactivated successfully.
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.035 2 INFO os_vif [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86')#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.036 2 INFO nova.virt.libvirt.driver [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Deleting instance files /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d_del#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.042 2 INFO nova.virt.libvirt.driver [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Deletion of /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d_del complete#033[00m
Oct  2 08:36:24 np0005466013 podman[246681]: 2025-10-02 12:36:24.090419143 +0000 UTC m=+0.055354408 container remove 5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.097 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4327185a-84ca-41a6-a982-d9a5e8313320]: (4, ('Thu Oct  2 12:36:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b (5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4)\n5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4\nThu Oct  2 12:36:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b (5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4)\n5dd30e9692e3c43c7611249b4e3aad8b45a515857a3022255702140e66f58fd4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.100 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[21312c5d-759a-4bfb-9190-7f0100166b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.101 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1f1562c-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.112 2 INFO nova.compute.manager [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.113 2 DEBUG oslo.service.loopingcall [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.113 2 DEBUG nova.compute.manager [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.113 2 DEBUG nova.network.neutron [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:24 np0005466013 kernel: tapa1f1562c-30: left promiscuous mode
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.160 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[759a0581-ff72-4da4-b5d7-3333dcd7ca13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.188 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed57ae2-7924-45c2-acc1-f03617389b08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.190 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1089fa8c-f874-4983-bae2-df3fa78ae48a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.212 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f334ba28-3311-4889-afb4-f9c61f9300c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655255, 'reachable_time': 31519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246696, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 systemd[1]: run-netns-ovnmeta\x2da1f1562c\x2d389b\x2d4488\x2db13e\x2d0f3594ca916b.mount: Deactivated successfully.
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.219 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.219 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[c107ab83-9862-4b3d-8b5c-6a7e4cfe0077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.220 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcb93a-8639-42b7-aafd-21f019307d66 in datapath a1f1562c-389b-4488-b13e-0f3594ca916b unbound from our chassis#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.222 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1f1562c-389b-4488-b13e-0f3594ca916b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.223 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[053b49e3-185b-40f9-8f4d-cf5924c6b16b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.224 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcb93a-8639-42b7-aafd-21f019307d66 in datapath a1f1562c-389b-4488-b13e-0f3594ca916b unbound from our chassis#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.225 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1f1562c-389b-4488-b13e-0f3594ca916b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:24.226 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[10d8f89a-023b-49a5-aabd-84b66bd7f636]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.424 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.425 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.440 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.551 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.551 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.564 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.565 2 INFO nova.compute.claims [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.751 2 DEBUG nova.compute.provider_tree [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.771 2 DEBUG nova.scheduler.client.report [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.794 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.795 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.828 2 DEBUG nova.network.neutron [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.853 2 INFO nova.compute.manager [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Took 0.74 seconds to deallocate network for instance.#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.867 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.868 2 DEBUG nova.network.neutron [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.911 2 INFO nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.936 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.986 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:24 np0005466013 nova_compute[192144]: 2025-10-02 12:36:24.986 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.066 2 DEBUG nova.policy [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.080 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.082 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.082 2 INFO nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Creating image(s)#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.083 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.084 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.085 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.108 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.135 2 DEBUG nova.network.neutron [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updated VIF entry in instance network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.136 2 DEBUG nova.network.neutron [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.145 2 DEBUG nova.compute.provider_tree [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.172 2 DEBUG nova.scheduler.client.report [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.178 2 DEBUG oslo_concurrency.lockutils [req-025eed00-24d3-49aa-bb83-ec32aa41a795 req-81967ab5-6c57-4972-b151-177c349fe989 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.179 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.180 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.181 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.202 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.234 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.268 2 INFO nova.scheduler.client.report [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Deleted allocations for instance 1d931a6f-0703-4e1f-acfc-b8402834c14d#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.276 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.277 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.324 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.326 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.326 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:25.336 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.381 2 DEBUG oslo_concurrency.lockutils [None req-706dc104-69a2-471c-b33c-c3269a624ec5 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.400 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.402 2 DEBUG nova.virt.disk.api [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Checking if we can resize image /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.402 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.464 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.466 2 DEBUG nova.virt.disk.api [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Cannot resize image /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.466 2 DEBUG nova.objects.instance [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'migration_context' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.491 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.491 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Ensure instance console log exists: /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.492 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.492 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:25 np0005466013 nova_compute[192144]: 2025-10-02 12:36:25.492 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.144 2 DEBUG nova.compute.manager [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.145 2 DEBUG oslo_concurrency.lockutils [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.145 2 DEBUG oslo_concurrency.lockutils [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.145 2 DEBUG oslo_concurrency.lockutils [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.146 2 DEBUG nova.compute.manager [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] No waiting events found dispatching network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.146 2 WARNING nova.compute.manager [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received unexpected event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.146 2 DEBUG nova.compute.manager [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-deleted-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.147 2 INFO nova.compute.manager [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Neutron deleted interface 37bcb93a-8639-42b7-aafd-21f019307d66; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.147 2 DEBUG nova.network.neutron [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.150 2 DEBUG nova.network.neutron [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Successfully created port: 7fd89911-3957-4f68-8adb-9a1c640f6bdb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:36:26 np0005466013 nova_compute[192144]: 2025-10-02 12:36:26.158 2 DEBUG nova.compute.manager [req-0ad4e674-9b43-4a5d-b180-0e61c850c281 req-5329c3c5-e9c5-4e1d-9e18-0f87fcab0e08 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Detach interface failed, port_id=37bcb93a-8639-42b7-aafd-21f019307d66, reason: Instance 1d931a6f-0703-4e1f-acfc-b8402834c14d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:36:27 np0005466013 nova_compute[192144]: 2025-10-02 12:36:27.467 2 DEBUG nova.network.neutron [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Successfully updated port: 7fd89911-3957-4f68-8adb-9a1c640f6bdb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:36:27 np0005466013 nova_compute[192144]: 2025-10-02 12:36:27.485 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:27 np0005466013 nova_compute[192144]: 2025-10-02 12:36:27.486 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:27 np0005466013 nova_compute[192144]: 2025-10-02 12:36:27.486 2 DEBUG nova.network.neutron [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:36:27 np0005466013 nova_compute[192144]: 2025-10-02 12:36:27.791 2 DEBUG nova.network.neutron [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:36:28 np0005466013 nova_compute[192144]: 2025-10-02 12:36:28.300 2 DEBUG nova.compute.manager [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-changed-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:28 np0005466013 nova_compute[192144]: 2025-10-02 12:36:28.300 2 DEBUG nova.compute.manager [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing instance network info cache due to event network-changed-7fd89911-3957-4f68-8adb-9a1c640f6bdb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:28 np0005466013 nova_compute[192144]: 2025-10-02 12:36:28.301 2 DEBUG oslo_concurrency.lockutils [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:28 np0005466013 podman[246713]: 2025-10-02 12:36:28.713099055 +0000 UTC m=+0.086051321 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:36:28 np0005466013 podman[246714]: 2025-10-02 12:36:28.729814255 +0000 UTC m=+0.092230704 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  2 08:36:28 np0005466013 podman[246715]: 2025-10-02 12:36:28.752713659 +0000 UTC m=+0.115938512 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.626 2 DEBUG nova.network.neutron [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.656 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.657 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Instance network_info: |[{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.658 2 DEBUG oslo_concurrency.lockutils [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.659 2 DEBUG nova.network.neutron [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing network info cache for port 7fd89911-3957-4f68-8adb-9a1c640f6bdb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.663 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Start _get_guest_xml network_info=[{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.671 2 WARNING nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.682 2 DEBUG nova.virt.libvirt.host [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.683 2 DEBUG nova.virt.libvirt.host [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.688 2 DEBUG nova.virt.libvirt.host [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.689 2 DEBUG nova.virt.libvirt.host [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.690 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.690 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.691 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.691 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.691 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.691 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.691 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.691 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.692 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.692 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.692 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.692 2 DEBUG nova.virt.hardware [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.698 2 DEBUG nova.virt.libvirt.vif [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:36:24Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.699 2 DEBUG nova.network.os_vif_util [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.699 2 DEBUG nova.network.os_vif_util [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:78:94,bridge_name='br-int',has_traffic_filtering=True,id=7fd89911-3957-4f68-8adb-9a1c640f6bdb,network=Network(f6b28beb-3fac-4e00-bd1f-932a66109b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fd89911-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.701 2 DEBUG nova.objects.instance [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.720 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <uuid>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</uuid>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <name>instance-000000a0</name>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:36:29</nova:creationTime>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <entry name="serial">2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <entry name="uuid">2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:16:78:94"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <target dev="tap7fd89911-39"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log" append="off"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:36:29 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:36:29 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:36:29 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:36:29 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.721 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Preparing to wait for external event network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.722 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.722 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:29Z|00687|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.723 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:29Z|00688|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.724 2 DEBUG nova.virt.libvirt.vif [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:36:24Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.724 2 DEBUG nova.network.os_vif_util [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.725 2 DEBUG nova.network.os_vif_util [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:78:94,bridge_name='br-int',has_traffic_filtering=True,id=7fd89911-3957-4f68-8adb-9a1c640f6bdb,network=Network(f6b28beb-3fac-4e00-bd1f-932a66109b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fd89911-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.726 2 DEBUG os_vif [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:78:94,bridge_name='br-int',has_traffic_filtering=True,id=7fd89911-3957-4f68-8adb-9a1c640f6bdb,network=Network(f6b28beb-3fac-4e00-bd1f-932a66109b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fd89911-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.733 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7fd89911-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.734 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7fd89911-39, col_values=(('external_ids', {'iface-id': '7fd89911-3957-4f68-8adb-9a1c640f6bdb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:78:94', 'vm-uuid': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466013 NetworkManager[51205]: <info>  [1759408589.7386] manager: (tap7fd89911-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.804 2 INFO os_vif [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:78:94,bridge_name='br-int',has_traffic_filtering=True,id=7fd89911-3957-4f68-8adb-9a1c640f6bdb,network=Network(f6b28beb-3fac-4e00-bd1f-932a66109b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fd89911-39')#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.869 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.870 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.870 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:16:78:94, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:36:29 np0005466013 nova_compute[192144]: 2025-10-02 12:36:29.871 2 INFO nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Using config drive#033[00m
Oct  2 08:36:30 np0005466013 nova_compute[192144]: 2025-10-02 12:36:30.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466013 nova_compute[192144]: 2025-10-02 12:36:30.956 2 INFO nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Creating config drive at /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config#033[00m
Oct  2 08:36:30 np0005466013 nova_compute[192144]: 2025-10-02 12:36:30.965 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiq2_a91a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.093 2 DEBUG oslo_concurrency.processutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiq2_a91a" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:31 np0005466013 kernel: tap7fd89911-39: entered promiscuous mode
Oct  2 08:36:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:31Z|00689|binding|INFO|Claiming lport 7fd89911-3957-4f68-8adb-9a1c640f6bdb for this chassis.
Oct  2 08:36:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:31Z|00690|binding|INFO|7fd89911-3957-4f68-8adb-9a1c640f6bdb: Claiming fa:16:3e:16:78:94 10.100.0.4
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 NetworkManager[51205]: <info>  [1759408591.1924] manager: (tap7fd89911-39): new Tun device (/org/freedesktop/NetworkManager/Devices/304)
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.203 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:78:94 10.100.0.4'], port_security=['fa:16:3e:16:78:94 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f032af94-2449-4fc3-bf18-1eca195c2d1c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bd58f68-4db4-498a-b482-4891f2ea7922, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=7fd89911-3957-4f68-8adb-9a1c640f6bdb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.205 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 7fd89911-3957-4f68-8adb-9a1c640f6bdb in datapath f6b28beb-3fac-4e00-bd1f-932a66109b1d bound to our chassis#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.206 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6b28beb-3fac-4e00-bd1f-932a66109b1d#033[00m
Oct  2 08:36:31 np0005466013 systemd-udevd[246796]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:31Z|00691|binding|INFO|Setting lport 7fd89911-3957-4f68-8adb-9a1c640f6bdb ovn-installed in OVS
Oct  2 08:36:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:31Z|00692|binding|INFO|Setting lport 7fd89911-3957-4f68-8adb-9a1c640f6bdb up in Southbound
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.229 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[396121ec-2d25-481b-a231-7e8ded01e672]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.230 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6b28beb-31 in ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 NetworkManager[51205]: <info>  [1759408591.2366] device (tap7fd89911-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:36:31 np0005466013 NetworkManager[51205]: <info>  [1759408591.2377] device (tap7fd89911-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.234 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6b28beb-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.235 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7c05d0-3dd2-420e-aa9b-e4dc257cb6d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.236 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[560b0a5c-8c10-4a06-a30f-1e3931b5121b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 systemd-machined[152202]: New machine qemu-74-instance-000000a0.
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.254 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[f960d8b7-a6cb-47de-9fb7-b8a8e62d4e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 systemd[1]: Started Virtual Machine qemu-74-instance-000000a0.
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.281 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0859ac08-d96b-4640-ad90-abb7d5722db1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.317 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[cd745379-58ae-4eed-8264-245010684eab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 NetworkManager[51205]: <info>  [1759408591.3246] manager: (tapf6b28beb-30): new Veth device (/org/freedesktop/NetworkManager/Devices/305)
Oct  2 08:36:31 np0005466013 systemd-udevd[246799]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.323 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc0cedc-1375-45ca-a472-1e146abd94f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.360 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7f8b75e5-fa72-490b-8b65-e71ffc0e1aab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.365 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c405a5-3fee-4483-8747-2594b2751c14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 NetworkManager[51205]: <info>  [1759408591.3938] device (tapf6b28beb-30): carrier: link connected
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.399 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[90f175cc-0cb7-4e51-9baf-9f7474cafc7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.422 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5026a104-c700-4f27-85ea-334e13b42313]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6b28beb-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:0a:fd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658301, 'reachable_time': 39236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246829, 'error': None, 'target': 'ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.443 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[667150f0-5145-420c-90a2-97738b69e402]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:afd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 658301, 'tstamp': 658301}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246830, 'error': None, 'target': 'ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.467 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[100282db-5f46-4422-b43f-90d3e1a553f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6b28beb-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:0a:fd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658301, 'reachable_time': 39236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246831, 'error': None, 'target': 'ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.514 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[13546063-2425-494f-bd39-385f42c7942e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.586 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[269430e9-d2af-459f-adc2-b68b8e5eebf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.587 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6b28beb-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.588 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.588 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6b28beb-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 kernel: tapf6b28beb-30: entered promiscuous mode
Oct  2 08:36:31 np0005466013 NetworkManager[51205]: <info>  [1759408591.5917] manager: (tapf6b28beb-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.598 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6b28beb-30, col_values=(('external_ids', {'iface-id': '02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.601 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6b28beb-3fac-4e00-bd1f-932a66109b1d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6b28beb-3fac-4e00-bd1f-932a66109b1d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.602 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0b30d662-44d6-4323-aadb-a16c95ba685a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.603 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-f6b28beb-3fac-4e00-bd1f-932a66109b1d
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/f6b28beb-3fac-4e00-bd1f-932a66109b1d.pid.haproxy
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID f6b28beb-3fac-4e00-bd1f-932a66109b1d
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:36:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:31.604 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'env', 'PROCESS_TAG=haproxy-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6b28beb-3fac-4e00-bd1f-932a66109b1d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:36:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:31Z|00693|binding|INFO|Releasing lport 02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c from this chassis (sb_readonly=0)
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.656 2 DEBUG nova.compute.manager [req-e6aa12ee-54ca-42b1-9379-51a0b08d6847 req-1844fa47-704b-4c6e-aedc-c230c305040b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.657 2 DEBUG oslo_concurrency.lockutils [req-e6aa12ee-54ca-42b1-9379-51a0b08d6847 req-1844fa47-704b-4c6e-aedc-c230c305040b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.658 2 DEBUG oslo_concurrency.lockutils [req-e6aa12ee-54ca-42b1-9379-51a0b08d6847 req-1844fa47-704b-4c6e-aedc-c230c305040b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.658 2 DEBUG oslo_concurrency.lockutils [req-e6aa12ee-54ca-42b1-9379-51a0b08d6847 req-1844fa47-704b-4c6e-aedc-c230c305040b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.659 2 DEBUG nova.compute.manager [req-e6aa12ee-54ca-42b1-9379-51a0b08d6847 req-1844fa47-704b-4c6e-aedc-c230c305040b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Processing event network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.891 2 DEBUG nova.network.neutron [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updated VIF entry in instance network info cache for port 7fd89911-3957-4f68-8adb-9a1c640f6bdb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.892 2 DEBUG nova.network.neutron [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:31 np0005466013 nova_compute[192144]: 2025-10-02 12:36:31.923 2 DEBUG oslo_concurrency.lockutils [req-440ef8d2-89f3-4e8b-b90e-72e59b145a88 req-b3059b40-f632-4eb9-849e-ba0456415a1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:32 np0005466013 podman[246870]: 2025-10-02 12:36:32.047359918 +0000 UTC m=+0.058146671 container create d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:36:32 np0005466013 systemd[1]: Started libpod-conmon-d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad.scope.
Oct  2 08:36:32 np0005466013 podman[246870]: 2025-10-02 12:36:32.020378269 +0000 UTC m=+0.031165022 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:36:32 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:36:32 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/277afd71fb3fc70971fd153beb7e7fafeaa6e69bfa2b36ee3d3533aa1c9278e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:32 np0005466013 podman[246870]: 2025-10-02 12:36:32.138437396 +0000 UTC m=+0.149224129 container init d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:36:32 np0005466013 podman[246883]: 2025-10-02 12:36:32.140671365 +0000 UTC m=+0.054191889 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:36:32 np0005466013 podman[246870]: 2025-10-02 12:36:32.144957509 +0000 UTC m=+0.155744242 container start d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:36:32 np0005466013 podman[246886]: 2025-10-02 12:36:32.149055767 +0000 UTC m=+0.060955050 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:36:32 np0005466013 neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d[246897]: [NOTICE]   (246930) : New worker (246932) forked
Oct  2 08:36:32 np0005466013 neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d[246897]: [NOTICE]   (246930) : Loading success.
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.203 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408592.203394, 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.204 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] VM Started (Lifecycle Event)#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.207 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.216 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.221 2 INFO nova.virt.libvirt.driver [-] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Instance spawned successfully.#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.222 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.233 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.237 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.259 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.260 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.260 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.262 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.263 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.263 2 DEBUG nova.virt.libvirt.driver [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.302 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.302 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408592.2035413, 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.303 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.360 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.364 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408592.210075, 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.364 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.395 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.401 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.407 2 INFO nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Took 7.33 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.407 2 DEBUG nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.435 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.537 2 INFO nova.compute.manager [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Took 8.02 seconds to build instance.#033[00m
Oct  2 08:36:32 np0005466013 nova_compute[192144]: 2025-10-02 12:36:32.558 2 DEBUG oslo_concurrency.lockutils [None req-52e52c54-5b5d-47cf-b6de-1fe49939db2a a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.564 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.565 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.566 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.566 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.567 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.581 2 INFO nova.compute.manager [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Terminating instance#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.596 2 DEBUG nova.compute.manager [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:36:33 np0005466013 kernel: tap5be4618b-6d (unregistering): left promiscuous mode
Oct  2 08:36:33 np0005466013 NetworkManager[51205]: <info>  [1759408593.6243] device (tap5be4618b-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:33Z|00694|binding|INFO|Releasing lport 5be4618b-6dbd-4495-af12-ea729df149d7 from this chassis (sb_readonly=0)
Oct  2 08:36:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:33Z|00695|binding|INFO|Setting lport 5be4618b-6dbd-4495-af12-ea729df149d7 down in Southbound
Oct  2 08:36:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:33Z|00696|binding|INFO|Removing iface tap5be4618b-6d ovn-installed in OVS
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.658 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:d5:ad 10.100.0.5'], port_security=['fa:16:3e:6a:d5:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4660930-bac7-4d92-b95e-2296da9c1763, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=5be4618b-6dbd-4495-af12-ea729df149d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.660 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 5be4618b-6dbd-4495-af12-ea729df149d7 in datapath 385e0a9e-c250-418d-8cab-e7e3ae4506c1 unbound from our chassis#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.662 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 385e0a9e-c250-418d-8cab-e7e3ae4506c1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.663 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dff71d2f-032b-4117-b6bc-9e9c720abc4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.664 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 namespace which is not needed anymore#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 kernel: tapa9826a7d-32 (unregistering): left promiscuous mode
Oct  2 08:36:33 np0005466013 NetworkManager[51205]: <info>  [1759408593.6755] device (tapa9826a7d-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:33Z|00697|binding|INFO|Releasing lport a9826a7d-3298-45d6-b564-4b199244dec1 from this chassis (sb_readonly=0)
Oct  2 08:36:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:33Z|00698|binding|INFO|Setting lport a9826a7d-3298-45d6-b564-4b199244dec1 down in Southbound
Oct  2 08:36:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:33Z|00699|binding|INFO|Removing iface tapa9826a7d-32 ovn-installed in OVS
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.697 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:00:e6 2001:db8::f816:3eff:fe5d:e6'], port_security=['fa:16:3e:5d:00:e6 2001:db8::f816:3eff:fe5d:e6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5d:e6/64', 'neutron:device_id': 'cd5cbe34-283a-4177-9d0b-bc35fadcde72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4522b631-3a21-451f-8605-7c2b34273ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09de0cb2-1ed0-42b1-8efb-533f84345cc8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a9826a7d-3298-45d6-b564-4b199244dec1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Oct  2 08:36:33 np0005466013 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009a.scope: Consumed 17.467s CPU time.
Oct  2 08:36:33 np0005466013 systemd-machined[152202]: Machine qemu-72-instance-0000009a terminated.
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.811 2 DEBUG nova.compute.manager [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.812 2 DEBUG oslo_concurrency.lockutils [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.813 2 DEBUG oslo_concurrency.lockutils [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.813 2 DEBUG oslo_concurrency.lockutils [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.813 2 DEBUG nova.compute.manager [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] No waiting events found dispatching network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.813 2 WARNING nova.compute.manager [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received unexpected event network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb for instance with vm_state active and task_state None.#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.814 2 DEBUG nova.compute.manager [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-changed-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.814 2 DEBUG nova.compute.manager [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing instance network info cache due to event network-changed-5be4618b-6dbd-4495-af12-ea729df149d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.814 2 DEBUG oslo_concurrency.lockutils [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.814 2 DEBUG oslo_concurrency.lockutils [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.815 2 DEBUG nova.network.neutron [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Refreshing network info cache for port 5be4618b-6dbd-4495-af12-ea729df149d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:33 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [NOTICE]   (245867) : haproxy version is 2.8.14-c23fe91
Oct  2 08:36:33 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [NOTICE]   (245867) : path to executable is /usr/sbin/haproxy
Oct  2 08:36:33 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [WARNING]  (245867) : Exiting Master process...
Oct  2 08:36:33 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [WARNING]  (245867) : Exiting Master process...
Oct  2 08:36:33 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [ALERT]    (245867) : Current worker (245869) exited with code 143 (Terminated)
Oct  2 08:36:33 np0005466013 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[245863]: [WARNING]  (245867) : All workers exited. Exiting... (0)
Oct  2 08:36:33 np0005466013 systemd[1]: libpod-7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1.scope: Deactivated successfully.
Oct  2 08:36:33 np0005466013 podman[246966]: 2025-10-02 12:36:33.835916988 +0000 UTC m=+0.051521606 container died 7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:36:33 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1-userdata-shm.mount: Deactivated successfully.
Oct  2 08:36:33 np0005466013 systemd[1]: var-lib-containers-storage-overlay-44d07cb0920fd101e16bb8cc5218a44289b5c60f4fe6103599a50c87912cc307-merged.mount: Deactivated successfully.
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.885 2 INFO nova.virt.libvirt.driver [-] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Instance destroyed successfully.#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.887 2 DEBUG nova.objects.instance [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid cd5cbe34-283a-4177-9d0b-bc35fadcde72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:33 np0005466013 podman[246966]: 2025-10-02 12:36:33.888782694 +0000 UTC m=+0.104387312 container cleanup 7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:36:33 np0005466013 systemd[1]: libpod-conmon-7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1.scope: Deactivated successfully.
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.916 2 DEBUG nova.virt.libvirt.vif [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1033926726',display_name='tempest-TestGettingAddress-server-1033926726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1033926726',id=154,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:35:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-v2dngbvu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:35:00Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cd5cbe34-283a-4177-9d0b-bc35fadcde72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.919 2 DEBUG nova.network.os_vif_util [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.922 2 DEBUG nova.network.os_vif_util [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:d5:ad,bridge_name='br-int',has_traffic_filtering=True,id=5be4618b-6dbd-4495-af12-ea729df149d7,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be4618b-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.924 2 DEBUG os_vif [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:d5:ad,bridge_name='br-int',has_traffic_filtering=True,id=5be4618b-6dbd-4495-af12-ea729df149d7,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be4618b-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.930 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5be4618b-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.946 2 INFO os_vif [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:d5:ad,bridge_name='br-int',has_traffic_filtering=True,id=5be4618b-6dbd-4495-af12-ea729df149d7,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be4618b-6d')#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.948 2 DEBUG nova.virt.libvirt.vif [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1033926726',display_name='tempest-TestGettingAddress-server-1033926726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1033926726',id=154,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:35:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-v2dngbvu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:35:00Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cd5cbe34-283a-4177-9d0b-bc35fadcde72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.949 2 DEBUG nova.network.os_vif_util [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:33 np0005466013 podman[247023]: 2025-10-02 12:36:33.949423283 +0000 UTC m=+0.035609370 container remove 7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.949 2 DEBUG nova.network.os_vif_util [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:00:e6,bridge_name='br-int',has_traffic_filtering=True,id=a9826a7d-3298-45d6-b564-4b199244dec1,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9826a7d-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.950 2 DEBUG os_vif [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:00:e6,bridge_name='br-int',has_traffic_filtering=True,id=a9826a7d-3298-45d6-b564-4b199244dec1,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9826a7d-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.954 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9826a7d-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.957 2 DEBUG nova.compute.manager [req-dbc2e1ff-043e-4077-9fc0-7186804aa27d req-5dbf6a1a-81f5-4554-b974-8317d07df28d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-unplugged-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.962 2 DEBUG oslo_concurrency.lockutils [req-dbc2e1ff-043e-4077-9fc0-7186804aa27d req-5dbf6a1a-81f5-4554-b974-8317d07df28d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.962 2 DEBUG oslo_concurrency.lockutils [req-dbc2e1ff-043e-4077-9fc0-7186804aa27d req-5dbf6a1a-81f5-4554-b974-8317d07df28d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.962 2 DEBUG oslo_concurrency.lockutils [req-dbc2e1ff-043e-4077-9fc0-7186804aa27d req-5dbf6a1a-81f5-4554-b974-8317d07df28d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.963 2 DEBUG nova.compute.manager [req-dbc2e1ff-043e-4077-9fc0-7186804aa27d req-5dbf6a1a-81f5-4554-b974-8317d07df28d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] No waiting events found dispatching network-vif-unplugged-5be4618b-6dbd-4495-af12-ea729df149d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.963 2 DEBUG nova.compute.manager [req-dbc2e1ff-043e-4077-9fc0-7186804aa27d req-5dbf6a1a-81f5-4554-b974-8317d07df28d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-unplugged-5be4618b-6dbd-4495-af12-ea729df149d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.963 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2697b16c-32d1-4487-8992-f926f9a7098a]: (4, ('Thu Oct  2 12:36:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 (7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1)\n7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1\nThu Oct  2 12:36:33 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 (7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1)\n7baa2d64357ae1bda25a05f8a9aa9c16e7e8d5c3f8f0c24c903f5f451211c3a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.966 2 INFO os_vif [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:00:e6,bridge_name='br-int',has_traffic_filtering=True,id=a9826a7d-3298-45d6-b564-4b199244dec1,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9826a7d-32')#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.968 2 INFO nova.virt.libvirt.driver [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Deleting instance files /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72_del#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.968 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[370c42bc-5539-4267-8818-79f9a7731b05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.969 2 INFO nova.virt.libvirt.driver [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Deletion of /var/lib/nova/instances/cd5cbe34-283a-4177-9d0b-bc35fadcde72_del complete#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.969 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e0a9e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:33 np0005466013 kernel: tap385e0a9e-c0: left promiscuous mode
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 nova_compute[192144]: 2025-10-02 12:36:33.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:33.984 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0aac5da9-5f0f-4747-b357-1149876fcc35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.018 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5bba81-ac35-4d0d-b16e-1cddb4fafd7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.018 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ba29086b-9928-4bed-a3c9-f3c88e789717]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.037 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fd12bcdc-884f-4d67-9eba-0b0e76ff99b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648973, 'reachable_time': 28668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247037, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.039 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.039 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[aeae9f43-3922-4c20-bcdd-31742ec4e394]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.040 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a9826a7d-3298-45d6-b564-4b199244dec1 in datapath 4522b631-3a21-451f-8605-7c2b34273ecd unbound from our chassis#033[00m
Oct  2 08:36:34 np0005466013 systemd[1]: run-netns-ovnmeta\x2d385e0a9e\x2dc250\x2d418d\x2d8cab\x2de7e3ae4506c1.mount: Deactivated successfully.
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.042 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4522b631-3a21-451f-8605-7c2b34273ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.042 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[068fd24c-1781-438f-829b-8615e6e81742]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.043 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd namespace which is not needed anymore#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.050 2 INFO nova.compute.manager [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.050 2 DEBUG oslo.service.loopingcall [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.051 2 DEBUG nova.compute.manager [-] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.051 2 DEBUG nova.network.neutron [-] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:36:34 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [NOTICE]   (245981) : haproxy version is 2.8.14-c23fe91
Oct  2 08:36:34 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [NOTICE]   (245981) : path to executable is /usr/sbin/haproxy
Oct  2 08:36:34 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [WARNING]  (245981) : Exiting Master process...
Oct  2 08:36:34 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [WARNING]  (245981) : Exiting Master process...
Oct  2 08:36:34 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [ALERT]    (245981) : Current worker (245983) exited with code 143 (Terminated)
Oct  2 08:36:34 np0005466013 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[245955]: [WARNING]  (245981) : All workers exited. Exiting... (0)
Oct  2 08:36:34 np0005466013 systemd[1]: libpod-370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9.scope: Deactivated successfully.
Oct  2 08:36:34 np0005466013 podman[247056]: 2025-10-02 12:36:34.182322638 +0000 UTC m=+0.045878971 container died 370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:36:34 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9-userdata-shm.mount: Deactivated successfully.
Oct  2 08:36:34 np0005466013 systemd[1]: var-lib-containers-storage-overlay-e5e9c2b14c51779ee1ae18effd476d9d4403db9fb19b87b899e448ec4eb6d222-merged.mount: Deactivated successfully.
Oct  2 08:36:34 np0005466013 podman[247056]: 2025-10-02 12:36:34.219902228 +0000 UTC m=+0.083458591 container cleanup 370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:36:34 np0005466013 systemd[1]: libpod-conmon-370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9.scope: Deactivated successfully.
Oct  2 08:36:34 np0005466013 podman[247087]: 2025-10-02 12:36:34.460557943 +0000 UTC m=+0.219515328 container remove 370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.468 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[79eddee2-e44b-4c48-85de-32404b85c012]: (4, ('Thu Oct  2 12:36:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd (370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9)\n370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9\nThu Oct  2 12:36:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd (370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9)\n370d19c4416df4ec77b34531905fd720ba9a0735772425dfdaba95f47ae2efc9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.470 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[eff52315-8a91-47dd-b67d-518d0ebc722f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.470 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4522b631-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:34 np0005466013 kernel: tap4522b631-30: left promiscuous mode
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.502 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a74d6e4a-1e5d-49be-8d66-003c5e0786e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.530 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f71fb245-6d7e-4492-b895-40ef2b1face1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.531 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[084ed387-701a-4322-8a31-97cc9d12ebd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.552 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[48117f55-ac9b-4d34-afb7-7c7e8125e47d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649178, 'reachable_time': 40352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247101, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.554 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:36:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:36:34.554 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b504c056-4820-40ec-935d-1acf05bab427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.611 2 DEBUG nova.compute.manager [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-unplugged-a9826a7d-3298-45d6-b564-4b199244dec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.612 2 DEBUG oslo_concurrency.lockutils [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.612 2 DEBUG oslo_concurrency.lockutils [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.613 2 DEBUG oslo_concurrency.lockutils [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.613 2 DEBUG nova.compute.manager [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] No waiting events found dispatching network-vif-unplugged-a9826a7d-3298-45d6-b564-4b199244dec1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.613 2 DEBUG nova.compute.manager [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-unplugged-a9826a7d-3298-45d6-b564-4b199244dec1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.614 2 DEBUG nova.compute.manager [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.614 2 DEBUG oslo_concurrency.lockutils [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.615 2 DEBUG oslo_concurrency.lockutils [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.615 2 DEBUG oslo_concurrency.lockutils [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.616 2 DEBUG nova.compute.manager [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] No waiting events found dispatching network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:34 np0005466013 nova_compute[192144]: 2025-10-02 12:36:34.616 2 WARNING nova.compute.manager [req-af5d531d-c0a5-4df3-891d-3b8c1a2c1d9e req-2a78f735-632f-49fe-912c-90f344c63ac3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received unexpected event network-vif-plugged-a9826a7d-3298-45d6-b564-4b199244dec1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:36:34 np0005466013 systemd[1]: run-netns-ovnmeta\x2d4522b631\x2d3a21\x2d451f\x2d8605\x2d7c2b34273ecd.mount: Deactivated successfully.
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.151 2 DEBUG nova.network.neutron [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updated VIF entry in instance network info cache for port 5be4618b-6dbd-4495-af12-ea729df149d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.152 2 DEBUG nova.network.neutron [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updating instance_info_cache with network_info: [{"id": "5be4618b-6dbd-4495-af12-ea729df149d7", "address": "fa:16:3e:6a:d5:ad", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be4618b-6d", "ovs_interfaceid": "5be4618b-6dbd-4495-af12-ea729df149d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a9826a7d-3298-45d6-b564-4b199244dec1", "address": "fa:16:3e:5d:00:e6", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5d:e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9826a7d-32", "ovs_interfaceid": "a9826a7d-3298-45d6-b564-4b199244dec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.180 2 DEBUG oslo_concurrency.lockutils [req-c98d9fd9-99d4-4a95-9482-25d0f10fc5d0 req-3d1ef264-b9aa-443b-813d-e98fed66b48b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cd5cbe34-283a-4177-9d0b-bc35fadcde72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:35 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:35Z|00700|binding|INFO|Releasing lport 02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c from this chassis (sb_readonly=0)
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:35Z|00701|binding|INFO|Releasing lport 02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c from this chassis (sb_readonly=0)
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.591 2 DEBUG nova.network.neutron [-] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.615 2 INFO nova.compute.manager [-] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Took 1.56 seconds to deallocate network for instance.#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.710 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.711 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.790 2 DEBUG nova.compute.provider_tree [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466013 NetworkManager[51205]: <info>  [1759408595.7953] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Oct  2 08:36:35 np0005466013 NetworkManager[51205]: <info>  [1759408595.7959] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.806 2 DEBUG nova.scheduler.client.report [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.825 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:35Z|00702|binding|INFO|Releasing lport 02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c from this chassis (sb_readonly=0)
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466013 nova_compute[192144]: 2025-10-02 12:36:35.951 2 INFO nova.scheduler.client.report [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance cd5cbe34-283a-4177-9d0b-bc35fadcde72#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.045 2 DEBUG nova.compute.manager [req-a95c8921-6727-4c6e-a160-dece9a520a02 req-071d1d4c-7757-40ae-a22d-1cbca53bd510 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.046 2 DEBUG oslo_concurrency.lockutils [req-a95c8921-6727-4c6e-a160-dece9a520a02 req-071d1d4c-7757-40ae-a22d-1cbca53bd510 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.046 2 DEBUG oslo_concurrency.lockutils [req-a95c8921-6727-4c6e-a160-dece9a520a02 req-071d1d4c-7757-40ae-a22d-1cbca53bd510 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.047 2 DEBUG oslo_concurrency.lockutils [req-a95c8921-6727-4c6e-a160-dece9a520a02 req-071d1d4c-7757-40ae-a22d-1cbca53bd510 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.047 2 DEBUG nova.compute.manager [req-a95c8921-6727-4c6e-a160-dece9a520a02 req-071d1d4c-7757-40ae-a22d-1cbca53bd510 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] No waiting events found dispatching network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.048 2 WARNING nova.compute.manager [req-a95c8921-6727-4c6e-a160-dece9a520a02 req-071d1d4c-7757-40ae-a22d-1cbca53bd510 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received unexpected event network-vif-plugged-5be4618b-6dbd-4495-af12-ea729df149d7 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.049 2 DEBUG oslo_concurrency.lockutils [None req-2197a112-beec-4624-8c6f-a51a1abaa501 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cd5cbe34-283a-4177-9d0b-bc35fadcde72" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:36 np0005466013 nova_compute[192144]: 2025-10-02 12:36:36.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.035 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.036 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.036 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.036 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.108 2 DEBUG nova.compute.manager [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-deleted-5be4618b-6dbd-4495-af12-ea729df149d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.109 2 DEBUG nova.compute.manager [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Received event network-vif-deleted-a9826a7d-3298-45d6-b564-4b199244dec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.109 2 DEBUG nova.compute.manager [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-changed-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.110 2 DEBUG nova.compute.manager [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing instance network info cache due to event network-changed-7fd89911-3957-4f68-8adb-9a1c640f6bdb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.110 2 DEBUG oslo_concurrency.lockutils [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.110 2 DEBUG oslo_concurrency.lockutils [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.110 2 DEBUG nova.network.neutron [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing network info cache for port 7fd89911-3957-4f68-8adb-9a1c640f6bdb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.137 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.224 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.225 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.279 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.435 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.437 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5490MB free_disk=73.13273620605469GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.438 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.438 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.599 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.600 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.600 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.689 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.705 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.729 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:36:37 np0005466013 nova_compute[192144]: 2025-10-02 12:36:37.729 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:38 np0005466013 nova_compute[192144]: 2025-10-02 12:36:38.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:38 np0005466013 nova_compute[192144]: 2025-10-02 12:36:38.994 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408583.9940066, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:38 np0005466013 nova_compute[192144]: 2025-10-02 12:36:38.995 2 INFO nova.compute.manager [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:36:39 np0005466013 nova_compute[192144]: 2025-10-02 12:36:39.012 2 DEBUG nova.compute.manager [None req-40575189-0cdb-4581-bae3-95df50516a3a - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:39 np0005466013 nova_compute[192144]: 2025-10-02 12:36:39.730 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:39 np0005466013 nova_compute[192144]: 2025-10-02 12:36:39.731 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:36:39 np0005466013 nova_compute[192144]: 2025-10-02 12:36:39.903 2 DEBUG nova.network.neutron [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updated VIF entry in instance network info cache for port 7fd89911-3957-4f68-8adb-9a1c640f6bdb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:39 np0005466013 nova_compute[192144]: 2025-10-02 12:36:39.904 2 DEBUG nova.network.neutron [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:39 np0005466013 nova_compute[192144]: 2025-10-02 12:36:39.956 2 DEBUG oslo_concurrency.lockutils [req-319004bf-6d18-416c-8e43-66c874b9709f req-4506ede6-b659-4cb3-a49b-fd1cae33b293 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:40 np0005466013 nova_compute[192144]: 2025-10-02 12:36:40.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:40 np0005466013 nova_compute[192144]: 2025-10-02 12:36:40.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:40 np0005466013 nova_compute[192144]: 2025-10-02 12:36:40.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:42 np0005466013 nova_compute[192144]: 2025-10-02 12:36:42.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:43 np0005466013 nova_compute[192144]: 2025-10-02 12:36:43.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:43 np0005466013 nova_compute[192144]: 2025-10-02 12:36:43.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:44Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:16:78:94 10.100.0.4
Oct  2 08:36:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:44Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:16:78:94 10.100.0.4
Oct  2 08:36:44 np0005466013 nova_compute[192144]: 2025-10-02 12:36:44.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:45 np0005466013 nova_compute[192144]: 2025-10-02 12:36:45.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:45 np0005466013 nova_compute[192144]: 2025-10-02 12:36:45.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:47 np0005466013 ovn_controller[94366]: 2025-10-02T12:36:47Z|00703|binding|INFO|Releasing lport 02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c from this chassis (sb_readonly=0)
Oct  2 08:36:47 np0005466013 nova_compute[192144]: 2025-10-02 12:36:47.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:48 np0005466013 nova_compute[192144]: 2025-10-02 12:36:48.882 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408593.8810492, cd5cbe34-283a-4177-9d0b-bc35fadcde72 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:48 np0005466013 nova_compute[192144]: 2025-10-02 12:36:48.883 2 INFO nova.compute.manager [-] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:36:48 np0005466013 nova_compute[192144]: 2025-10-02 12:36:48.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:48 np0005466013 nova_compute[192144]: 2025-10-02 12:36:48.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:48 np0005466013 nova_compute[192144]: 2025-10-02 12:36:48.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:36:48 np0005466013 nova_compute[192144]: 2025-10-02 12:36:48.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:36:49 np0005466013 podman[247126]: 2025-10-02 12:36:49.730023987 +0000 UTC m=+0.090640994 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:36:49 np0005466013 podman[247127]: 2025-10-02 12:36:49.730046258 +0000 UTC m=+0.090731078 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  2 08:36:49 np0005466013 podman[247128]: 2025-10-02 12:36:49.765497901 +0000 UTC m=+0.127193612 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:36:50 np0005466013 nova_compute[192144]: 2025-10-02 12:36:50.194 2 DEBUG nova.compute.manager [None req-af330644-cf03-4902-83a0-345df86428be - - - - - -] [instance: cd5cbe34-283a-4177-9d0b-bc35fadcde72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:50 np0005466013 nova_compute[192144]: 2025-10-02 12:36:50.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:51 np0005466013 nova_compute[192144]: 2025-10-02 12:36:51.009 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:51 np0005466013 nova_compute[192144]: 2025-10-02 12:36:51.010 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:51 np0005466013 nova_compute[192144]: 2025-10-02 12:36:51.010 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:36:51 np0005466013 nova_compute[192144]: 2025-10-02 12:36:51.011 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:51 np0005466013 nova_compute[192144]: 2025-10-02 12:36:51.247 2 INFO nova.compute.manager [None req-170c15fa-d814-4cb0-b328-6de24d0d1f45 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Get console output#033[00m
Oct  2 08:36:51 np0005466013 nova_compute[192144]: 2025-10-02 12:36:51.252 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:36:53 np0005466013 nova_compute[192144]: 2025-10-02 12:36:53.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:53 np0005466013 nova_compute[192144]: 2025-10-02 12:36:53.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:54 np0005466013 nova_compute[192144]: 2025-10-02 12:36:54.433 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:55 np0005466013 nova_compute[192144]: 2025-10-02 12:36:55.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:56 np0005466013 nova_compute[192144]: 2025-10-02 12:36:56.487 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:56 np0005466013 nova_compute[192144]: 2025-10-02 12:36:56.488 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:36:56 np0005466013 nova_compute[192144]: 2025-10-02 12:36:56.489 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:56 np0005466013 nova_compute[192144]: 2025-10-02 12:36:56.490 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:58 np0005466013 nova_compute[192144]: 2025-10-02 12:36:58.724 2 DEBUG oslo_concurrency.lockutils [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "interface-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:58 np0005466013 nova_compute[192144]: 2025-10-02 12:36:58.725 2 DEBUG oslo_concurrency.lockutils [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:58 np0005466013 nova_compute[192144]: 2025-10-02 12:36:58.726 2 DEBUG nova.objects.instance [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'flavor' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:58 np0005466013 nova_compute[192144]: 2025-10-02 12:36:58.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:58 np0005466013 nova_compute[192144]: 2025-10-02 12:36:58.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:59 np0005466013 nova_compute[192144]: 2025-10-02 12:36:59.226 2 DEBUG nova.objects.instance [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:59 np0005466013 nova_compute[192144]: 2025-10-02 12:36:59.251 2 DEBUG nova.network.neutron [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:36:59 np0005466013 podman[247195]: 2025-10-02 12:36:59.711257556 +0000 UTC m=+0.077974769 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute)
Oct  2 08:36:59 np0005466013 podman[247194]: 2025-10-02 12:36:59.715191039 +0000 UTC m=+0.079771276 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:36:59 np0005466013 podman[247193]: 2025-10-02 12:36:59.726089038 +0000 UTC m=+0.092284175 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:37:00 np0005466013 nova_compute[192144]: 2025-10-02 12:37:00.175 2 DEBUG nova.policy [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:37:00 np0005466013 nova_compute[192144]: 2025-10-02 12:37:00.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:00 np0005466013 nova_compute[192144]: 2025-10-02 12:37:00.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:01 np0005466013 nova_compute[192144]: 2025-10-02 12:37:01.976 2 DEBUG nova.network.neutron [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Successfully created port: 7b31c42c-5f0b-40fa-b913-4546441b9444 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:37:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:02.320 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:02.321 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:02.322 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:02 np0005466013 podman[247250]: 2025-10-02 12:37:02.692215845 +0000 UTC m=+0.062441915 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:37:02 np0005466013 podman[247251]: 2025-10-02 12:37:02.752632497 +0000 UTC m=+0.112360000 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.151 2 DEBUG nova.network.neutron [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Successfully updated port: 7b31c42c-5f0b-40fa-b913-4546441b9444 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.174 2 DEBUG oslo_concurrency.lockutils [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.174 2 DEBUG oslo_concurrency.lockutils [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.174 2 DEBUG nova.network.neutron [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.660 2 DEBUG nova.compute.manager [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-changed-7b31c42c-5f0b-40fa-b913-4546441b9444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.661 2 DEBUG nova.compute.manager [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing instance network info cache due to event network-changed-7b31c42c-5f0b-40fa-b913-4546441b9444. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.661 2 DEBUG oslo_concurrency.lockutils [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:03 np0005466013 nova_compute[192144]: 2025-10-02 12:37:03.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.706 2 DEBUG nova.network.neutron [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.742 2 DEBUG oslo_concurrency.lockutils [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.744 2 DEBUG oslo_concurrency.lockutils [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.745 2 DEBUG nova.network.neutron [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing network info cache for port 7b31c42c-5f0b-40fa-b913-4546441b9444 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.750 2 DEBUG nova.virt.libvirt.vif [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.751 2 DEBUG nova.network.os_vif_util [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.752 2 DEBUG nova.network.os_vif_util [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.753 2 DEBUG os_vif [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.754 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.761 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b31c42c-5f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.762 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b31c42c-5f, col_values=(('external_ids', {'iface-id': '7b31c42c-5f0b-40fa-b913-4546441b9444', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:99:e1', 'vm-uuid': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:05 np0005466013 NetworkManager[51205]: <info>  [1759408625.7672] manager: (tap7b31c42c-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.779 2 INFO os_vif [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f')#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.780 2 DEBUG nova.virt.libvirt.vif [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.780 2 DEBUG nova.network.os_vif_util [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.781 2 DEBUG nova.network.os_vif_util [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.783 2 DEBUG nova.virt.libvirt.guest [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:a5:99:e1"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <target dev="tap7b31c42c-5f"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:37:05 np0005466013 nova_compute[192144]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:37:05 np0005466013 kernel: tap7b31c42c-5f: entered promiscuous mode
Oct  2 08:37:05 np0005466013 NetworkManager[51205]: <info>  [1759408625.8032] manager: (tap7b31c42c-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Oct  2 08:37:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:05Z|00704|binding|INFO|Claiming lport 7b31c42c-5f0b-40fa-b913-4546441b9444 for this chassis.
Oct  2 08:37:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:05Z|00705|binding|INFO|7b31c42c-5f0b-40fa-b913-4546441b9444: Claiming fa:16:3e:a5:99:e1 10.100.0.29
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 systemd-udevd[247300]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.839 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:99:e1 10.100.0.29'], port_security=['fa:16:3e:a5:99:e1 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a970b3c6-2fc3-4025-868b-2e9af396991a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1aab0b39-6daf-41d1-a7da-b7bb077ff5e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf407807-38c2-4b6a-825d-3f40edf483e2, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=7b31c42c-5f0b-40fa-b913-4546441b9444) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.841 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 7b31c42c-5f0b-40fa-b913-4546441b9444 in datapath a970b3c6-2fc3-4025-868b-2e9af396991a bound to our chassis#033[00m
Oct  2 08:37:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:05Z|00706|binding|INFO|Setting lport 7b31c42c-5f0b-40fa-b913-4546441b9444 ovn-installed in OVS
Oct  2 08:37:05 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:05Z|00707|binding|INFO|Setting lport 7b31c42c-5f0b-40fa-b913-4546441b9444 up in Southbound
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.844 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a970b3c6-2fc3-4025-868b-2e9af396991a#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:05 np0005466013 NetworkManager[51205]: <info>  [1759408625.8567] device (tap7b31c42c-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:37:05 np0005466013 NetworkManager[51205]: <info>  [1759408625.8579] device (tap7b31c42c-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.864 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[48c4a306-0721-4d15-b87c-f908557e01eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.866 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa970b3c6-21 in ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.868 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa970b3c6-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.869 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aad4b9e9-c361-421c-b794-bc57508dd330]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.870 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[016da78a-6063-4928-b7ab-9a06b540cd57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.885 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[12e74a68-e61d-4904-a89d-8e5ccc1f9de8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.900 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b489f36a-7b53-49db-a3a7-153739a000da]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.918 2 DEBUG nova.virt.libvirt.driver [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.918 2 DEBUG nova.virt.libvirt.driver [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.919 2 DEBUG nova.virt.libvirt.driver [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:16:78:94, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.919 2 DEBUG nova.virt.libvirt.driver [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:a5:99:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.934 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfe3d92-51a5-4a45-a908-39b576e57d58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:05 np0005466013 NetworkManager[51205]: <info>  [1759408625.9446] manager: (tapa970b3c6-20): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.945 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[284e414c-e8d9-47a7-9675-a19aae3c6390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.952 2 DEBUG nova.virt.libvirt.guest [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:37:05</nova:creationTime>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:37:05 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    <nova:port uuid="7b31c42c-5f0b-40fa-b913-4546441b9444">
Oct  2 08:37:05 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:37:05 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:37:05 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:37:05 np0005466013 nova_compute[192144]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:37:05 np0005466013 nova_compute[192144]: 2025-10-02 12:37:05.997 2 DEBUG oslo_concurrency.lockutils [None req-ce6bccf6-2b4f-42d0-a5cf-008d03f8aac0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:05.996 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb36c5e-fd16-44fc-bb92-574cb41f39dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.000 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1d32b3-4501-4214-a6ea-d4db40cbd27e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 NetworkManager[51205]: <info>  [1759408626.0323] device (tapa970b3c6-20): carrier: link connected
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.043 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[801eb448-7e30-4ef8-bcae-08d7db899388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.071 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a3068e78-31ed-4c12-8980-fd98cc6d83e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa970b3c6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:4e:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661764, 'reachable_time': 35715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247329, 'error': None, 'target': 'ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.094 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ab298f-4c29-4f26-be9a-f65844ad9caf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:4e6e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661764, 'tstamp': 661764}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247330, 'error': None, 'target': 'ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.124 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d4ce4e-6020-44ad-90a1-bb375654279f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa970b3c6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:4e:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661764, 'reachable_time': 35715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247331, 'error': None, 'target': 'ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.168 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc48db38-7372-42c2-864d-a81487e76c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.194 2 DEBUG nova.compute.manager [req-c83bbd57-6ed8-4910-a210-b3cddc29af63 req-0fbdda58-cf57-44ff-95d2-8894ac3752e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.195 2 DEBUG oslo_concurrency.lockutils [req-c83bbd57-6ed8-4910-a210-b3cddc29af63 req-0fbdda58-cf57-44ff-95d2-8894ac3752e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.195 2 DEBUG oslo_concurrency.lockutils [req-c83bbd57-6ed8-4910-a210-b3cddc29af63 req-0fbdda58-cf57-44ff-95d2-8894ac3752e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.196 2 DEBUG oslo_concurrency.lockutils [req-c83bbd57-6ed8-4910-a210-b3cddc29af63 req-0fbdda58-cf57-44ff-95d2-8894ac3752e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.196 2 DEBUG nova.compute.manager [req-c83bbd57-6ed8-4910-a210-b3cddc29af63 req-0fbdda58-cf57-44ff-95d2-8894ac3752e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] No waiting events found dispatching network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.196 2 WARNING nova.compute.manager [req-c83bbd57-6ed8-4910-a210-b3cddc29af63 req-0fbdda58-cf57-44ff-95d2-8894ac3752e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received unexpected event network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.269 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dca71011-4f61-4f45-b0a9-29ad22101657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.271 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa970b3c6-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.272 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.273 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa970b3c6-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:06 np0005466013 kernel: tapa970b3c6-20: entered promiscuous mode
Oct  2 08:37:06 np0005466013 NetworkManager[51205]: <info>  [1759408626.2782] manager: (tapa970b3c6-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.283 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa970b3c6-20, col_values=(('external_ids', {'iface-id': '6aa346c6-3e0c-4887-be68-d585d409cf95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:06 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:06Z|00708|binding|INFO|Releasing lport 6aa346c6-3e0c-4887-be68-d585d409cf95 from this chassis (sb_readonly=0)
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.288 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a970b3c6-2fc3-4025-868b-2e9af396991a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a970b3c6-2fc3-4025-868b-2e9af396991a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.290 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5e61c5da-c1e0-4ba9-b61c-c1f9f8e8edd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.292 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-a970b3c6-2fc3-4025-868b-2e9af396991a
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/a970b3c6-2fc3-4025-868b-2e9af396991a.pid.haproxy
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID a970b3c6-2fc3-4025-868b-2e9af396991a
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:37:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:06.294 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a', 'env', 'PROCESS_TAG=haproxy-a970b3c6-2fc3-4025-868b-2e9af396991a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a970b3c6-2fc3-4025-868b-2e9af396991a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:37:06 np0005466013 nova_compute[192144]: 2025-10-02 12:37:06.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:06 np0005466013 podman[247362]: 2025-10-02 12:37:06.760103131 +0000 UTC m=+0.076395370 container create cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:37:06 np0005466013 systemd[1]: Started libpod-conmon-cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281.scope.
Oct  2 08:37:06 np0005466013 podman[247362]: 2025-10-02 12:37:06.722371566 +0000 UTC m=+0.038663805 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:37:06 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:37:06 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f49f39c1a025e10fb86042e071ed27a9be40dc397c9777c326dd7ae577c7855a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:37:06 np0005466013 podman[247362]: 2025-10-02 12:37:06.860608822 +0000 UTC m=+0.176901061 container init cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:37:06 np0005466013 podman[247362]: 2025-10-02 12:37:06.867579188 +0000 UTC m=+0.183871387 container start cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:37:06 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [NOTICE]   (247381) : New worker (247383) forked
Oct  2 08:37:06 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [NOTICE]   (247381) : Loading success.
Oct  2 08:37:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:07Z|00709|binding|INFO|Releasing lport 02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c from this chassis (sb_readonly=0)
Oct  2 08:37:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:07Z|00710|binding|INFO|Releasing lport 6aa346c6-3e0c-4887-be68-d585d409cf95 from this chassis (sb_readonly=0)
Oct  2 08:37:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:07Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a5:99:e1 10.100.0.29
Oct  2 08:37:07 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:07Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a5:99:e1 10.100.0.29
Oct  2 08:37:07 np0005466013 nova_compute[192144]: 2025-10-02 12:37:07.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:07 np0005466013 nova_compute[192144]: 2025-10-02 12:37:07.604 2 DEBUG nova.network.neutron [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updated VIF entry in instance network info cache for port 7b31c42c-5f0b-40fa-b913-4546441b9444. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:37:07 np0005466013 nova_compute[192144]: 2025-10-02 12:37:07.604 2 DEBUG nova.network.neutron [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:07 np0005466013 nova_compute[192144]: 2025-10-02 12:37:07.633 2 DEBUG oslo_concurrency.lockutils [req-63e7fef9-c584-4376-8ee5-25ca5e61643f req-2f3bac89-c9b0-4f06-9831-d129fd42c96b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:07 np0005466013 nova_compute[192144]: 2025-10-02 12:37:07.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:08 np0005466013 nova_compute[192144]: 2025-10-02 12:37:08.363 2 DEBUG nova.compute.manager [req-c61c6306-980b-498b-bfd6-1b4c6cedb57c req-92c0da99-8594-462d-8c5e-d684d0561dc4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:08 np0005466013 nova_compute[192144]: 2025-10-02 12:37:08.364 2 DEBUG oslo_concurrency.lockutils [req-c61c6306-980b-498b-bfd6-1b4c6cedb57c req-92c0da99-8594-462d-8c5e-d684d0561dc4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:08 np0005466013 nova_compute[192144]: 2025-10-02 12:37:08.364 2 DEBUG oslo_concurrency.lockutils [req-c61c6306-980b-498b-bfd6-1b4c6cedb57c req-92c0da99-8594-462d-8c5e-d684d0561dc4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:08 np0005466013 nova_compute[192144]: 2025-10-02 12:37:08.365 2 DEBUG oslo_concurrency.lockutils [req-c61c6306-980b-498b-bfd6-1b4c6cedb57c req-92c0da99-8594-462d-8c5e-d684d0561dc4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:08 np0005466013 nova_compute[192144]: 2025-10-02 12:37:08.365 2 DEBUG nova.compute.manager [req-c61c6306-980b-498b-bfd6-1b4c6cedb57c req-92c0da99-8594-462d-8c5e-d684d0561dc4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] No waiting events found dispatching network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:37:08 np0005466013 nova_compute[192144]: 2025-10-02 12:37:08.365 2 WARNING nova.compute.manager [req-c61c6306-980b-498b-bfd6-1b4c6cedb57c req-92c0da99-8594-462d-8c5e-d684d0561dc4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received unexpected event network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:37:10 np0005466013 nova_compute[192144]: 2025-10-02 12:37:10.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:10 np0005466013 nova_compute[192144]: 2025-10-02 12:37:10.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:12 np0005466013 nova_compute[192144]: 2025-10-02 12:37:12.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:14.525 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2 2001:db8::f816:3eff:fefb:624a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fefb:624a/64', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e78bd1c4-7546-4ebe-a71b-a49e8c78f36c) old=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:14.527 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e78bd1c4-7546-4ebe-a71b-a49e8c78f36c in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 updated#033[00m
Oct  2 08:37:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:14.529 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:37:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:14.531 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b86776ba-7cc0-416f-b7a1-e5909e48188e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:15 np0005466013 nova_compute[192144]: 2025-10-02 12:37:15.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:15 np0005466013 nova_compute[192144]: 2025-10-02 12:37:15.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.360 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'name': 'tempest-TestNetworkBasicOps-server-1457788990', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a0', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6e2a4899168a47618e377cb3ac85ddd2', 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'hostId': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.375 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.376 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5df1fa9-a50e-4844-b5d4-4da40427ff1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.361958', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '875d49ea-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.040171585, 'message_signature': '3dbcbdd094b9a721ac1c8c3efb9a7fccd21a2d44ea51be11bda65fa201cc78a3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 
'2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.361958', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '875d55a2-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.040171585, 'message_signature': 'add656ce25c9143f788dbf5036fd31adfec71143fcd77bbaed53c6526af980d7'}]}, 'timestamp': '2025-10-02 12:37:16.376551', '_unique_id': 'b1ee1077fb4c4e8c8a19afbbbbcf45ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.377 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.378 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.382 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf / tap7fd89911-39 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.382 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf / tap7b31c42c-5f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.383 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.383 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '131cd385-bb6f-42bb-b9c0-1eee64055b2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.378739', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '875e60be-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '223d9cc8450d79703bb92538be771faeb270ef4e5308ca966753e23d337913ae'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.378739', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '875e6a5a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'a1cb88b7b1380511ec01842094005c0884aeff1be4d90cfddab5250850041959'}]}, 'timestamp': '2025-10-02 12:37:16.383606', '_unique_id': '47c180dfa0e44612b6703df098fcb015'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.384 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.385 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.385 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.385 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>]
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.385 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.412 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.413 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6571c105-eb63-40e5-aab8-e45fafe66499', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 334, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.385453', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8762f8a4-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '927bcf532ea83a4153b29c6d1b01916376b3fe32950a507c512a4fe126f1a6fb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': 
None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.385453', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87630e8e-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '7aca621aed5f93c050a8efe0891cb73a467ba98b865d979935653b993bf55512'}]}, 'timestamp': '2025-10-02 12:37:16.414139', '_unique_id': '91bde9202cd741a58bd14ed6548779e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.415 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.416 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.417 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.packets volume: 147 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.417 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30893a14-5cc0-41f8-8204-aa40bfd23f54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 147, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.417147', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '87639764-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'd8dbca7cb65b8c6e0d5700416a8dcf9980711322c5652f7e716d38504b9145cf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 
'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.417147', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '8763ab0a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'a47746c4fd475448d5bfcc38cf4a9f00aa87695abda66a471f45dce8f4db5a8c'}]}, 'timestamp': '2025-10-02 12:37:16.418143', '_unique_id': '39087004880e4f6fa41f6ad534816671'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.419 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.420 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.447 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/cpu volume: 11730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09cbb8c2-b1cf-47a4-a4c9-029b68adfde8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11730000000, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'timestamp': '2025-10-02T12:37:16.420677', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '87683350-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.125168073, 'message_signature': 'c695077f1603e672749c7118fa9660f1a97a62a96c4b26c14b6c6fe18aecdde4'}]}, 'timestamp': '2025-10-02 12:37:16.447948', '_unique_id': '75ecfe1ec1534576a627f56d18f46a15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.449 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.451 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.bytes volume: 28363 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.451 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.bytes volume: 1630 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb90ae84-51fa-44b5-a72f-e1dc417b5cf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28363, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.451077', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '8768c54a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '8ca83e226e2c540a444ac1d5ba54a76bdf917758ec2af276eecc9201ad6722e6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1630, 'user_id': 
'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.451077', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '8768d904-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '36da617657325dc5c21ad0fdcfce18761e5225d66fc935f9aa0aa1b1ad8a63bd'}]}, 'timestamp': '2025-10-02 12:37:16.452093', '_unique_id': 'c57c650457484f36ae791c3cb05cdf5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.454 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.write.latency volume: 2409948836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.455 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33eddf43-6fc1-4c0f-9f99-64001f5d5779', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2409948836, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.454687', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '876952d0-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': 'e64809a4ae3b98402445081ca80620519a4c843490ad4ee73caeb642fbe64d47'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 
'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.454687', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '876965c2-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '4b9845fc151ed0f0e8403df220749e1b7cec686b5782eaf20650333b0a6e769d'}]}, 'timestamp': '2025-10-02 12:37:16.455674', '_unique_id': '9299ea94c7324639a8e33cc4da4d63e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.456 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.458 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.458 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.458 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bec647a5-6b58-4818-a8fe-22f141f845d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.458252', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '8769dcaa-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '993cb5ffa51ac89cf41307cbac182736a61bf13d20b7e3263a66bd9181e967a6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.458252', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '8769f00a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'cc87a66278c05e5f013fe13f0d14fbe0147ba260fe3ec67422fc1f483c24093b'}]}, 'timestamp': '2025-10-02 12:37:16.459229', '_unique_id': '270a458cbbdc4006bebc42c7736be0dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.460 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.461 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.461 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.462 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a29ecf0f-0f71-44da-9072-234de572f657', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.461804', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '876a6968-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '84189154e651bb5668fed22ed61472272163084541e15a6ffcfb4666c7a47d04'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': 
None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.461804', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '876a7aac-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '4fc21f97b91a592e6119b15258afc16e910e6a495d20f5c70b877ad752143f4f'}]}, 'timestamp': '2025-10-02 12:37:16.462789', '_unique_id': '7c4effb37ab642dc8973bdb077f3d861'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.464 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.465 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.465 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.465 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>]
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.466 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.466 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '735a463c-245e-4a78-9ee0-838ef80dd8cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.466152', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '876b119c-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '1d0255c30885178ea88500e7c2139654600062d5dea4a1864c3d2647f63b5c1f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.466152', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '876b2556-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '758105cf1ef1e576bbcca982900cd024501cb055e44ad7158fe4252d2fb073d8'}]}, 'timestamp': '2025-10-02 12:37:16.467151', '_unique_id': '204fd2827b584d32888738aaa532e1ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.468 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.469 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.469 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.470 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90043ef3-51f4-40e3-93cc-5292521d3a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.469687', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '876b9cc0-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'aa223d16e0faebf8a059e5443113039699728552297bfd3349ef37d211fae171'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.469687', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '876bae72-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '955ca58809e6bd70651a024061f38b3cd7dd6dfaa62f93bfd8f95488c58653cb'}]}, 'timestamp': '2025-10-02 12:37:16.470655', '_unique_id': 'a25797494fef42089b266aa8b49db870'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.471 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.472 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.473 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.473 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>]
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.473 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.473 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.write.bytes volume: 73035776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.474 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2fe4f94-79bd-45b1-83ae-9685e7e44746', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73035776, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.473745', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '876c3b12-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': 'f79f141f766addea680b358a377a5fc78cb86449a5a6dddaaa78cac8941c334a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 
'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.473745', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '876c4c10-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '0990c9a33cdaf61324626ad9f6b7a6b915ab680827148f21e71633c313919fc7'}]}, 'timestamp': '2025-10-02 12:37:16.474673', '_unique_id': '2d89dbb7f30740559c8e0b2bc75f812e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.475 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.476 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.477 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.read.bytes volume: 30153216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.478 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9062d314-1bb3-43b8-9e35-8222cc501c84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30153216, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.477410', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '876cdf18-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '242fc1538e439ded646ca2e6b5877f4972addc93f501a8608eaf67f1db70cf9b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.477410', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '876cf19c-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '3581f8a681d72d1420d28135bfd4bc0ef6524904087e63c3c6c41fa139718c08'}]}, 'timestamp': '2025-10-02 12:37:16.478907', '_unique_id': '29ca17d1d54e4f068e080de3f8818c6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.480 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.481 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.482 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.482 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1457788990>]
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.482 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.482 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.read.latency volume: 554326469 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.483 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.read.latency volume: 46193239 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff76a54c-d134-4c5e-b213-f2c0bf0f01ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 554326469, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.482602', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '876d91a6-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '6f9f7aa8e6667cdf5931721b94f3f7bd6890ea627c6fe10921e2278129b977fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46193239, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.482602', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '876da2d6-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.063672407, 'message_signature': '8ff4a0d47a0c403be7d28f7b85c666b3ea0c3e1dd4c8e60250fd6180e0d9ccf4'}]}, 'timestamp': '2025-10-02 12:37:16.483392', '_unique_id': '306cfb43b80145aeba07d13032a9aad3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.484 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.485 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.485 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/memory.usage volume: 48.05859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce3e40f5-b7c3-4241-a347-ad445cc5f296', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 48.05859375, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'timestamp': '2025-10-02T12:37:16.485532', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '876e0348-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.125168073, 'message_signature': '46aa7efa589e29f0efd86a46616315671290bedbcebdc176f59509c2e2c26378'}]}, 'timestamp': '2025-10-02 12:37:16.485909', '_unique_id': '808a103b2ffb495d9ada43cdce580d7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.486 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.487 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.487 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.487 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54c6de58-89c0-4ab2-a3f3-4f3b77986345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.487548', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '876e5258-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '47a84cdfab578f4190a655537a63d5c319e0bc27a4d156c8fdb88bab34e9b130'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.487548', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '876e609a-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '552965ae56a3204c0a619bdb773539d79c2f90467cc6bb7aa1a2b6cb5de2c6a9'}]}, 'timestamp': '2025-10-02 12:37:16.488256', '_unique_id': '3f6b599feca2447db628c7359d316c73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.488 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.490 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.490 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.490 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10db0f9e-5ceb-411b-8d51-ed405eb99f27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.490291', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '876ebcca-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.040171585, 'message_signature': '30e7b55b85adf723936b2ca602a165c5aa6d7e347cc1bde7d4ac5b4503ee350f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 
'2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.490291', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '876ec846-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.040171585, 'message_signature': '6c3605a73fc0c94265ebd53d809d2d4a23e8497c3ab1c018283af3a6d78100c1'}]}, 'timestamp': '2025-10-02 12:37:16.490928', '_unique_id': 'a07d8739ab374d18abb02f47a37a6113'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.491 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.492 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.492 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.492 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13f4ac96-d922-4cc1-8144-21db505a6593', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-vda', 'timestamp': '2025-10-02T12:37:16.492582', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '876f15e4-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.040171585, 'message_signature': '5bb8bafa5a6d83e0e516be2c8025c90d183cfe490ef7580ea647b6f4be59995a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 
'2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-sda', 'timestamp': '2025-10-02T12:37:16.492582', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'instance-000000a0', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '876f2322-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.040171585, 'message_signature': '308422ef412a59fcd4c28b1759ac588a2f941d1cae1ac425da36e79700e6a995'}]}, 'timestamp': '2025-10-02 12:37:16.493276', '_unique_id': 'c672e6c3cd344d229de6a0f07e466fa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.494 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.495 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.495 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.495 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9a80950-0ade-4db5-93f8-704db138d9d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.495449', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '876f86b4-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '443e9165371d3aab5f4361b298b93a3943d4faa6b6a96b125cf26120264b8e2f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.495449', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '876f9532-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '3643c3c65315cdabf1cdace2289b559aa4f045663511795fbf358945e7c1d0f6'}]}, 'timestamp': '2025-10-02 12:37:16.496162', '_unique_id': 'd6e0bcbd87aa484081acfa12aee28504'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.496 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.497 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.497 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.packets volume: 150 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.498 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44f7bfd5-aa30-4f46-934c-9664a2f0e856', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 150, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.497882', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '876fe51e-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'e30ddc4f365bff164d5686ef84a59a0aad11b5365ffc6312adff54388e44e128'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 
'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.497882', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '876ff0ea-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'e0a763d10c07cf9b318c811ab26a01338ad47f7bca1193f8211494d12f85bbf7'}]}, 'timestamp': '2025-10-02 12:37:16.498505', '_unique_id': 'f86fb9f57a504ddd86da359c170f359a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.499 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.500 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.bytes volume: 23676 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.500 12 DEBUG ceilometer.compute.pollsters [-] 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/network.outgoing.bytes volume: 1480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f961f207-5650-4168-84a1-c82ac3fca508', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23676, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7fd89911-39', 'timestamp': '2025-10-02T12:37:16.500118', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7fd89911-39', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:16:78:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7fd89911-39'}, 'message_id': '87703c62-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': 'b6ae2a9651708fcc02bc0796468a09d00f87f7456f7d7ed48f0b8e70a69411a2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1480, 'user_id': 
'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-000000a0-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-tap7b31c42c-5f', 'timestamp': '2025-10-02T12:37:16.500118', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1457788990', 'name': 'tap7b31c42c-5f', 'instance_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'instance_type': 'm1.nano', 'host': '86538acccf15652cbcbd6dce337a361b90f94626f8f65bd3c5c73fc4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:99:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b31c42c-5f'}, 'message_id': '87704874-9f8c-11f0-9b9a-fa163ec2af05', 'monotonic_time': 6628.05698662, 'message_signature': '8a46ef6f413111571d663de89ea3a2bc0e749e888b88b79ba91d934c9e6cb813'}]}, 'timestamp': '2025-10-02 12:37:16.500744', '_unique_id': '0403931bff2f41ec8b05e92727f2d01b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:37:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:37:16.501 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:37:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:16.564 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:16.565 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:37:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:16.566 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:16 np0005466013 nova_compute[192144]: 2025-10-02 12:37:16.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:20 np0005466013 nova_compute[192144]: 2025-10-02 12:37:20.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:20 np0005466013 podman[247392]: 2025-10-02 12:37:20.683295873 +0000 UTC m=+0.059069951 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:37:20 np0005466013 podman[247393]: 2025-10-02 12:37:20.690412896 +0000 UTC m=+0.063173670 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:37:20 np0005466013 podman[247394]: 2025-10-02 12:37:20.724862018 +0000 UTC m=+0.093149432 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:37:20 np0005466013 nova_compute[192144]: 2025-10-02 12:37:20.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:21.252 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2 2001:db8:0:1:f816:3eff:fefb:624a 2001:db8::f816:3eff:fefb:624a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fefb:624a/64 2001:db8::f816:3eff:fefb:624a/64', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e78bd1c4-7546-4ebe-a71b-a49e8c78f36c) old=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2 2001:db8::f816:3eff:fefb:624a'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fefb:624a/64', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:21.255 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e78bd1c4-7546-4ebe-a71b-a49e8c78f36c in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 updated#033[00m
Oct  2 08:37:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:21.259 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:37:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:21.260 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[56bbb188-b1f5-4059-a4ec-e3f652934f3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:25 np0005466013 nova_compute[192144]: 2025-10-02 12:37:25.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:25 np0005466013 nova_compute[192144]: 2025-10-02 12:37:25.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:26 np0005466013 nova_compute[192144]: 2025-10-02 12:37:26.735 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:26 np0005466013 nova_compute[192144]: 2025-10-02 12:37:26.736 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:26 np0005466013 nova_compute[192144]: 2025-10-02 12:37:26.757 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:37:26 np0005466013 nova_compute[192144]: 2025-10-02 12:37:26.861 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:26 np0005466013 nova_compute[192144]: 2025-10-02 12:37:26.862 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:26 np0005466013 nova_compute[192144]: 2025-10-02 12:37:26.872 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:37:26 np0005466013 nova_compute[192144]: 2025-10-02 12:37:26.873 2 INFO nova.compute.claims [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.021 2 DEBUG nova.compute.provider_tree [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.036 2 DEBUG nova.scheduler.client.report [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.064 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.065 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.119 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.120 2 DEBUG nova.network.neutron [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.142 2 INFO nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.164 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.293 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.295 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.296 2 INFO nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Creating image(s)#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.297 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.297 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.298 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.315 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.352 2 DEBUG nova.policy [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.370 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.371 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.371 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.381 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.442 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.444 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.480 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.481 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.482 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.572 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.573 2 DEBUG nova.virt.disk.api [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.573 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.644 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.645 2 DEBUG nova.virt.disk.api [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.646 2 DEBUG nova.objects.instance [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 5fc50e91-2988-453d-87f4-afbd5214d7f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.664 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.664 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Ensure instance console log exists: /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.665 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.666 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:27 np0005466013 nova_compute[192144]: 2025-10-02 12:37:27.666 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.179 2 DEBUG nova.network.neutron [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Successfully created port: a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.762 2 DEBUG nova.network.neutron [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Successfully updated port: a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.776 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.777 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.777 2 DEBUG nova.network.neutron [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.864 2 DEBUG nova.compute.manager [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-changed-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.865 2 DEBUG nova.compute.manager [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Refreshing instance network info cache due to event network-changed-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.865 2 DEBUG oslo_concurrency.lockutils [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:28 np0005466013 nova_compute[192144]: 2025-10-02 12:37:28.922 2 DEBUG nova.network.neutron [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.084 2 DEBUG nova.network.neutron [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updating instance_info_cache with network_info: [{"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.106 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.106 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Instance network_info: |[{"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.107 2 DEBUG oslo_concurrency.lockutils [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.107 2 DEBUG nova.network.neutron [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Refreshing network info cache for port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.112 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Start _get_guest_xml network_info=[{"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.116 2 WARNING nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.121 2 DEBUG nova.virt.libvirt.host [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.122 2 DEBUG nova.virt.libvirt.host [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.125 2 DEBUG nova.virt.libvirt.host [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.125 2 DEBUG nova.virt.libvirt.host [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.127 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.127 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.127 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.128 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.128 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.128 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.129 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.129 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.129 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.129 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.130 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.130 2 DEBUG nova.virt.hardware [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.135 2 DEBUG nova.virt.libvirt.vif [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:37:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1257094405',display_name='tempest-TestGettingAddress-server-1257094405',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1257094405',id=163,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgra/wXYI+rCsG5upBBPiIDSxRkzAR1A6pFxaSU1LPFfL3D5RfEN0Sz4k+PeFJCJFhU6eEresOI7XeTo6tERj3riWEwLSsbwiPk4PW1j9Dz/nyAQSV9AMMgUGCskGAq1Q==',key_name='tempest-TestGettingAddress-228200713',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-24sur2rw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:27Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=5fc50e91-2988-453d-87f4-afbd5214d7f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.135 2 DEBUG nova.network.os_vif_util [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.136 2 DEBUG nova.network.os_vif_util [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:c0:e4,bridge_name='br-int',has_traffic_filtering=True,id=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745c1a1-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.137 2 DEBUG nova.objects.instance [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5fc50e91-2988-453d-87f4-afbd5214d7f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.150 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <uuid>5fc50e91-2988-453d-87f4-afbd5214d7f6</uuid>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <name>instance-000000a3</name>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestGettingAddress-server-1257094405</nova:name>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:37:30</nova:creationTime>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        <nova:port uuid="a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fecb:c0e4" ipVersion="6"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fecb:c0e4" ipVersion="6"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <entry name="serial">5fc50e91-2988-453d-87f4-afbd5214d7f6</entry>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <entry name="uuid">5fc50e91-2988-453d-87f4-afbd5214d7f6</entry>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk.config"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:cb:c0:e4"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <target dev="tapa745c1a1-9c"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/console.log" append="off"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:37:30 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:37:30 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:37:30 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:37:30 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.152 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Preparing to wait for external event network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.153 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.153 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.154 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.155 2 DEBUG nova.virt.libvirt.vif [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:37:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1257094405',display_name='tempest-TestGettingAddress-server-1257094405',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1257094405',id=163,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgra/wXYI+rCsG5upBBPiIDSxRkzAR1A6pFxaSU1LPFfL3D5RfEN0Sz4k+PeFJCJFhU6eEresOI7XeTo6tERj3riWEwLSsbwiPk4PW1j9Dz/nyAQSV9AMMgUGCskGAq1Q==',key_name='tempest-TestGettingAddress-228200713',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-24sur2rw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:27Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=5fc50e91-2988-453d-87f4-afbd5214d7f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.156 2 DEBUG nova.network.os_vif_util [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.157 2 DEBUG nova.network.os_vif_util [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:c0:e4,bridge_name='br-int',has_traffic_filtering=True,id=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745c1a1-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.158 2 DEBUG os_vif [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:c0:e4,bridge_name='br-int',has_traffic_filtering=True,id=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745c1a1-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.160 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.161 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.166 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa745c1a1-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.167 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa745c1a1-9c, col_values=(('external_ids', {'iface-id': 'a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:c0:e4', 'vm-uuid': '5fc50e91-2988-453d-87f4-afbd5214d7f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 NetworkManager[51205]: <info>  [1759408650.1706] manager: (tapa745c1a1-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.179 2 INFO os_vif [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:c0:e4,bridge_name='br-int',has_traffic_filtering=True,id=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745c1a1-9c')#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.257 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.257 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.258 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:cb:c0:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.258 2 INFO nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Using config drive#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.613 2 INFO nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Creating config drive at /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk.config#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.618 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfhn3f51q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:30 np0005466013 podman[247478]: 2025-10-02 12:37:30.693072285 +0000 UTC m=+0.069328611 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:37:30 np0005466013 podman[247479]: 2025-10-02 12:37:30.702067595 +0000 UTC m=+0.064642765 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal)
Oct  2 08:37:30 np0005466013 podman[247480]: 2025-10-02 12:37:30.713800201 +0000 UTC m=+0.068476724 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.754 2 DEBUG oslo_concurrency.processutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfhn3f51q" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:30 np0005466013 kernel: tapa745c1a1-9c: entered promiscuous mode
Oct  2 08:37:30 np0005466013 NetworkManager[51205]: <info>  [1759408650.8189] manager: (tapa745c1a1-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Oct  2 08:37:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:30Z|00711|binding|INFO|Claiming lport a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb for this chassis.
Oct  2 08:37:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:30Z|00712|binding|INFO|a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb: Claiming fa:16:3e:cb:c0:e4 10.100.0.6 2001:db8:0:1:f816:3eff:fecb:c0e4 2001:db8::f816:3eff:fecb:c0e4
Oct  2 08:37:30 np0005466013 systemd-udevd[247551]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.866 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:c0:e4 10.100.0.6 2001:db8:0:1:f816:3eff:fecb:c0e4 2001:db8::f816:3eff:fecb:c0e4'], port_security=['fa:16:3e:cb:c0:e4 10.100.0.6 2001:db8:0:1:f816:3eff:fecb:c0e4 2001:db8::f816:3eff:fecb:c0e4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8:0:1:f816:3eff:fecb:c0e4/64 2001:db8::f816:3eff:fecb:c0e4/64', 'neutron:device_id': '5fc50e91-2988-453d-87f4-afbd5214d7f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '38601fe0-d139-4a59-b46e-238283b5fcdd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.868 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 bound to our chassis#033[00m
Oct  2 08:37:30 np0005466013 NetworkManager[51205]: <info>  [1759408650.8730] device (tapa745c1a1-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.872 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6#033[00m
Oct  2 08:37:30 np0005466013 NetworkManager[51205]: <info>  [1759408650.8745] device (tapa745c1a1-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.884 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f23fe4cf-e55a-46a6-84a7-386f215b1f1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.885 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d7388dd-d1 in ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:37:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:30Z|00713|binding|INFO|Setting lport a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb ovn-installed in OVS
Oct  2 08:37:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:30Z|00714|binding|INFO|Setting lport a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb up in Southbound
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 nova_compute[192144]: 2025-10-02 12:37:30.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.888 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d7388dd-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.888 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e413a7d7-015c-4c00-b655-a7de5b95ddf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.893 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ee544795-211e-4dcc-9d4e-239cc00636fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:30 np0005466013 systemd-machined[152202]: New machine qemu-75-instance-000000a3.
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.906 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a5ae78-3eac-40a7-8a9f-3b22b59fe011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:30 np0005466013 systemd[1]: Started Virtual Machine qemu-75-instance-000000a3.
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.944 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[47ce1336-0963-4fbc-87b6-89154444cbd3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.977 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[622ce162-d1a9-4ac8-8651-0d3f2f02f3cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:30.983 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[32f0d355-8435-47ce-b1fa-b269c1b6600a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:30 np0005466013 NetworkManager[51205]: <info>  [1759408650.9843] manager: (tap1d7388dd-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/315)
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.012 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[14ff46d4-3cef-4c4b-9fea-163c3e63405e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.014 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[25080957-e702-45a2-b7a9-8851e1687a65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 NetworkManager[51205]: <info>  [1759408651.0408] device (tap1d7388dd-d0): carrier: link connected
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.049 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[46de86f2-5177-4598-937d-a70ba5433867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.053 2 DEBUG nova.compute.manager [req-7a2eae15-47c4-4f08-aa18-445b7ebd9c05 req-c98a392d-70c3-4b82-a81b-e4d6ad791b88 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.053 2 DEBUG oslo_concurrency.lockutils [req-7a2eae15-47c4-4f08-aa18-445b7ebd9c05 req-c98a392d-70c3-4b82-a81b-e4d6ad791b88 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.053 2 DEBUG oslo_concurrency.lockutils [req-7a2eae15-47c4-4f08-aa18-445b7ebd9c05 req-c98a392d-70c3-4b82-a81b-e4d6ad791b88 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.053 2 DEBUG oslo_concurrency.lockutils [req-7a2eae15-47c4-4f08-aa18-445b7ebd9c05 req-c98a392d-70c3-4b82-a81b-e4d6ad791b88 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.054 2 DEBUG nova.compute.manager [req-7a2eae15-47c4-4f08-aa18-445b7ebd9c05 req-c98a392d-70c3-4b82-a81b-e4d6ad791b88 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Processing event network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.077 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[32d13488-fbc0-4009-876d-d804835281a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7388dd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 664265, 'reachable_time': 33984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247587, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.095 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cda38c8c-9163-4c83-a106-a032f81e4c73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefb:624a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 664265, 'tstamp': 664265}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247588, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.108 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1f82289a-c6f4-4355-8b5d-291b12c6200f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7388dd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 664265, 'reachable_time': 33984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247590, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.141 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7432c9-7b00-4123-9495-db5eb69075cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.206 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[25aa2f58-5740-4cea-b79b-15d999235127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.207 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7388dd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.208 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.208 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d7388dd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005466013 kernel: tap1d7388dd-d0: entered promiscuous mode
Oct  2 08:37:31 np0005466013 NetworkManager[51205]: <info>  [1759408651.2117] manager: (tap1d7388dd-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.213 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d7388dd-d0, col_values=(('external_ids', {'iface-id': 'e78bd1c4-7546-4ebe-a71b-a49e8c78f36c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:31Z|00715|binding|INFO|Releasing lport e78bd1c4-7546-4ebe-a71b-a49e8c78f36c from this chassis (sb_readonly=0)
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.216 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.222 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d29cf1ff-81f3-45f0-8fba-342571763894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.222 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.pid.haproxy
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:37:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:31.223 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'env', 'PROCESS_TAG=haproxy-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.594 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.596 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408651.5937078, 5fc50e91-2988-453d-87f4-afbd5214d7f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.597 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] VM Started (Lifecycle Event)#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.608 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.613 2 INFO nova.virt.libvirt.driver [-] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Instance spawned successfully.#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.614 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.631 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.639 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.640 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.642 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.643 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.644 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.645 2 DEBUG nova.virt.libvirt.driver [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:37:31 np0005466013 podman[247628]: 2025-10-02 12:37:31.653246112 +0000 UTC m=+0.076940708 container create 6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.652 2 DEBUG nova.network.neutron [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updated VIF entry in instance network info cache for port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.655 2 DEBUG nova.network.neutron [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updating instance_info_cache with network_info: [{"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.659 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.691 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.692 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408651.5952046, 5fc50e91-2988-453d-87f4-afbd5214d7f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.692 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.694 2 DEBUG oslo_concurrency.lockutils [req-e9ce3a1e-62c4-4c29-b659-577f7705d6dc req-7f793172-931f-46a2-8ea0-09e8ee1296aa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:31 np0005466013 systemd[1]: Started libpod-conmon-6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70.scope.
Oct  2 08:37:31 np0005466013 podman[247628]: 2025-10-02 12:37:31.61724618 +0000 UTC m=+0.040940826 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.714 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.719 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408651.6067474, 5fc50e91-2988-453d-87f4-afbd5214d7f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.719 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:37:31 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.739 2 INFO nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Took 4.45 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.740 2 DEBUG nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.741 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:31 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde8710b46bd5dd8ae65822b81fd68b96f4c9f942fc195a828fd4b387b2aca42/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.754 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:37:31 np0005466013 podman[247628]: 2025-10-02 12:37:31.765945152 +0000 UTC m=+0.189639828 container init 6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:37:31 np0005466013 podman[247628]: 2025-10-02 12:37:31.775793609 +0000 UTC m=+0.199488225 container start 6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.784 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:37:31 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [NOTICE]   (247647) : New worker (247649) forked
Oct  2 08:37:31 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [NOTICE]   (247647) : Loading success.
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.842 2 INFO nova.compute.manager [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Took 5.02 seconds to build instance.#033[00m
Oct  2 08:37:31 np0005466013 nova_compute[192144]: 2025-10-02 12:37:31.860 2 DEBUG oslo_concurrency.lockutils [None req-b9acc284-2e1f-4a1a-a35b-5d09a19f156d 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:33 np0005466013 nova_compute[192144]: 2025-10-02 12:37:33.132 2 DEBUG nova.compute.manager [req-430eec8f-8de1-4ebd-9940-54349bb47c68 req-f1293934-aba3-460a-8e5e-8c0aaa00f924 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:33 np0005466013 nova_compute[192144]: 2025-10-02 12:37:33.132 2 DEBUG oslo_concurrency.lockutils [req-430eec8f-8de1-4ebd-9940-54349bb47c68 req-f1293934-aba3-460a-8e5e-8c0aaa00f924 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:33 np0005466013 nova_compute[192144]: 2025-10-02 12:37:33.133 2 DEBUG oslo_concurrency.lockutils [req-430eec8f-8de1-4ebd-9940-54349bb47c68 req-f1293934-aba3-460a-8e5e-8c0aaa00f924 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:33 np0005466013 nova_compute[192144]: 2025-10-02 12:37:33.133 2 DEBUG oslo_concurrency.lockutils [req-430eec8f-8de1-4ebd-9940-54349bb47c68 req-f1293934-aba3-460a-8e5e-8c0aaa00f924 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:33 np0005466013 nova_compute[192144]: 2025-10-02 12:37:33.134 2 DEBUG nova.compute.manager [req-430eec8f-8de1-4ebd-9940-54349bb47c68 req-f1293934-aba3-460a-8e5e-8c0aaa00f924 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] No waiting events found dispatching network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:37:33 np0005466013 nova_compute[192144]: 2025-10-02 12:37:33.134 2 WARNING nova.compute.manager [req-430eec8f-8de1-4ebd-9940-54349bb47c68 req-f1293934-aba3-460a-8e5e-8c0aaa00f924 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received unexpected event network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb for instance with vm_state active and task_state None.#033[00m
Oct  2 08:37:33 np0005466013 podman[247658]: 2025-10-02 12:37:33.6973425 +0000 UTC m=+0.060977830 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:37:33 np0005466013 podman[247659]: 2025-10-02 12:37:33.71436464 +0000 UTC m=+0.074827041 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:37:35 np0005466013 nova_compute[192144]: 2025-10-02 12:37:35.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:35 np0005466013 nova_compute[192144]: 2025-10-02 12:37:35.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:38 np0005466013 nova_compute[192144]: 2025-10-02 12:37:38.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.024 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.025 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.025 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.025 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.107 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.194 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.195 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.261 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.268 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.360 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.361 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.425 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.615 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.616 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5365MB free_disk=73.10295104980469GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.616 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.617 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.705 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.706 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 5fc50e91-2988-453d-87f4-afbd5214d7f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.706 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.706 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.750 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.785 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.806 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:37:39 np0005466013 nova_compute[192144]: 2025-10-02 12:37:39.806 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.612 2 DEBUG nova.compute.manager [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-changed-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.612 2 DEBUG nova.compute.manager [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Refreshing instance network info cache due to event network-changed-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.613 2 DEBUG oslo_concurrency.lockutils [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.613 2 DEBUG oslo_concurrency.lockutils [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.614 2 DEBUG nova.network.neutron [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Refreshing network info cache for port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.805 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:40 np0005466013 nova_compute[192144]: 2025-10-02 12:37:40.806 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:37:41 np0005466013 nova_compute[192144]: 2025-10-02 12:37:41.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:42 np0005466013 nova_compute[192144]: 2025-10-02 12:37:42.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:43 np0005466013 nova_compute[192144]: 2025-10-02 12:37:43.289 2 DEBUG nova.network.neutron [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updated VIF entry in instance network info cache for port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:37:43 np0005466013 nova_compute[192144]: 2025-10-02 12:37:43.290 2 DEBUG nova.network.neutron [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updating instance_info_cache with network_info: [{"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:43 np0005466013 nova_compute[192144]: 2025-10-02 12:37:43.331 2 DEBUG oslo_concurrency.lockutils [req-a3bfb38f-bc88-4154-bca1-1fac26d8f783 req-e53ea8ab-777a-427c-ad1a-8e3c12477648 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:44Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:c0:e4 10.100.0.6
Oct  2 08:37:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:37:44Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:c0:e4 10.100.0.6
Oct  2 08:37:44 np0005466013 nova_compute[192144]: 2025-10-02 12:37:44.519 2 DEBUG nova.compute.manager [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-changed-7b31c42c-5f0b-40fa-b913-4546441b9444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:44 np0005466013 nova_compute[192144]: 2025-10-02 12:37:44.520 2 DEBUG nova.compute.manager [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing instance network info cache due to event network-changed-7b31c42c-5f0b-40fa-b913-4546441b9444. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:37:44 np0005466013 nova_compute[192144]: 2025-10-02 12:37:44.520 2 DEBUG oslo_concurrency.lockutils [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:44 np0005466013 nova_compute[192144]: 2025-10-02 12:37:44.520 2 DEBUG oslo_concurrency.lockutils [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:44 np0005466013 nova_compute[192144]: 2025-10-02 12:37:44.520 2 DEBUG nova.network.neutron [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing network info cache for port 7b31c42c-5f0b-40fa-b913-4546441b9444 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:37:44 np0005466013 nova_compute[192144]: 2025-10-02 12:37:44.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:45 np0005466013 nova_compute[192144]: 2025-10-02 12:37:45.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:45 np0005466013 nova_compute[192144]: 2025-10-02 12:37:45.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:45 np0005466013 nova_compute[192144]: 2025-10-02 12:37:45.992 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:46 np0005466013 nova_compute[192144]: 2025-10-02 12:37:46.358 2 DEBUG nova.network.neutron [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updated VIF entry in instance network info cache for port 7b31c42c-5f0b-40fa-b913-4546441b9444. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:37:46 np0005466013 nova_compute[192144]: 2025-10-02 12:37:46.359 2 DEBUG nova.network.neutron [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:46 np0005466013 nova_compute[192144]: 2025-10-02 12:37:46.386 2 DEBUG oslo_concurrency.lockutils [req-35dd67f0-41f1-4610-8daf-b02dd1f66822 req-c11a4f51-7e2b-4dc0-a642-88e8927f0cb9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:50 np0005466013 nova_compute[192144]: 2025-10-02 12:37:50.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:50 np0005466013 nova_compute[192144]: 2025-10-02 12:37:50.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:50 np0005466013 nova_compute[192144]: 2025-10-02 12:37:50.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:50 np0005466013 nova_compute[192144]: 2025-10-02 12:37:50.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:37:50 np0005466013 nova_compute[192144]: 2025-10-02 12:37:50.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:37:51 np0005466013 nova_compute[192144]: 2025-10-02 12:37:51.166 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:51 np0005466013 nova_compute[192144]: 2025-10-02 12:37:51.167 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:51 np0005466013 nova_compute[192144]: 2025-10-02 12:37:51.167 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:37:51 np0005466013 nova_compute[192144]: 2025-10-02 12:37:51.168 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:51 np0005466013 podman[247731]: 2025-10-02 12:37:51.744421596 +0000 UTC m=+0.103630579 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:37:51 np0005466013 podman[247732]: 2025-10-02 12:37:51.744871 +0000 UTC m=+0.099623644 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:37:51 np0005466013 podman[247733]: 2025-10-02 12:37:51.798386167 +0000 UTC m=+0.148305211 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:37:53 np0005466013 nova_compute[192144]: 2025-10-02 12:37:53.931 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:53 np0005466013 nova_compute[192144]: 2025-10-02 12:37:53.952 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:53 np0005466013 nova_compute[192144]: 2025-10-02 12:37:53.953 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:37:53 np0005466013 nova_compute[192144]: 2025-10-02 12:37:53.954 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:53 np0005466013 nova_compute[192144]: 2025-10-02 12:37:53.954 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:55 np0005466013 nova_compute[192144]: 2025-10-02 12:37:55.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:55 np0005466013 nova_compute[192144]: 2025-10-02 12:37:55.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:59.930 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:37:59.931 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:37:59 np0005466013 nova_compute[192144]: 2025-10-02 12:37:59.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:00 np0005466013 nova_compute[192144]: 2025-10-02 12:38:00.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:00 np0005466013 nova_compute[192144]: 2025-10-02 12:38:00.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:01 np0005466013 podman[247801]: 2025-10-02 12:38:01.722924887 +0000 UTC m=+0.090610104 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:38:01 np0005466013 podman[247803]: 2025-10-02 12:38:01.726828939 +0000 UTC m=+0.077868107 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Oct  2 08:38:01 np0005466013 podman[247802]: 2025-10-02 12:38:01.735238811 +0000 UTC m=+0.088622522 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct  2 08:38:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:02.321 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:02.322 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:02.323 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:03.934 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:04 np0005466013 podman[247862]: 2025-10-02 12:38:04.699611733 +0000 UTC m=+0.072591682 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:38:04 np0005466013 podman[247861]: 2025-10-02 12:38:04.700348607 +0000 UTC m=+0.080609392 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:38:05 np0005466013 nova_compute[192144]: 2025-10-02 12:38:05.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:05 np0005466013 nova_compute[192144]: 2025-10-02 12:38:05.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005466013 nova_compute[192144]: 2025-10-02 12:38:10.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:10 np0005466013 nova_compute[192144]: 2025-10-02 12:38:10.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:15 np0005466013 nova_compute[192144]: 2025-10-02 12:38:15.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:15 np0005466013 nova_compute[192144]: 2025-10-02 12:38:15.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.437 2 DEBUG oslo_concurrency.lockutils [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "interface-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-7b31c42c-5f0b-40fa-b913-4546441b9444" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.437 2 DEBUG oslo_concurrency.lockutils [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-7b31c42c-5f0b-40fa-b913-4546441b9444" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.452 2 DEBUG nova.objects.instance [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'flavor' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.493 2 DEBUG nova.virt.libvirt.vif [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.494 2 DEBUG nova.network.os_vif_util [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.494 2 DEBUG nova.network.os_vif_util [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.496 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.498 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.500 2 DEBUG nova.virt.libvirt.driver [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Attempting to detach device tap7b31c42c-5f from instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.500 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:a5:99:e1"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <target dev="tap7b31c42c-5f"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.537 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.541 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface>not found in domain: <domain type='kvm' id='74'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <name>instance-000000a0</name>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <uuid>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</uuid>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:37:05</nova:creationTime>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:port uuid="7b31c42c-5f0b-40fa-b913-4546441b9444">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <memory unit='KiB'>131072</memory>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <resource>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <partition>/machine</partition>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </resource>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <sysinfo type='smbios'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='serial'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='uuid'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <boot dev='hd'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <smbios mode='sysinfo'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <vmcoreinfo state='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <feature policy='require' name='x2apic'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <feature policy='require' name='vme'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <clock offset='utc'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <timer name='hpet' present='no'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <on_reboot>restart</on_reboot>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <on_crash>destroy</on_crash>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <disk type='file' device='disk'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk' index='2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <backingStore type='file' index='3'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <format type='raw'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <backingStore/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      </backingStore>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target dev='vda' bus='virtio'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='virtio-disk0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <disk type='file' device='cdrom'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config' index='1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <backingStore/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target dev='sda' bus='sata'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <readonly/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='sata0-0-0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pcie.0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='1' port='0x10'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='2' port='0x11'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='3' port='0x12'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='4' port='0x13'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='5' port='0x14'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='6' port='0x15'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='7' port='0x16'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='8' port='0x17'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.8'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='9' port='0x18'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.9'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='10' port='0x19'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.10'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='11' port='0x1a'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.11'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='12' port='0x1b'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.12'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='13' port='0x1c'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.13'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='14' port='0x1d'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.14'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='15' port='0x1e'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.15'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='16' port='0x1f'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.16'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='17' port='0x20'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.17'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='18' port='0x21'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.18'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='19' port='0x22'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.19'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='20' port='0x23'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.20'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='21' port='0x24'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.21'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='22' port='0x25'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.22'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='23' port='0x26'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.23'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='24' port='0x27'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.24'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='25' port='0x28'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.25'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-pci-bridge'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.26'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='usb'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='sata' index='0'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='ide'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:16:78:94'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target dev='tap7fd89911-39'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='net0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:a5:99:e1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target dev='tap7b31c42c-5f'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='net1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <serial type='pty'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target type='isa-serial' port='0'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <model name='isa-serial'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      </target>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target type='serial' port='0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </console>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <input type='tablet' bus='usb'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='input0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <input type='mouse' bus='ps2'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='input1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <input type='keyboard' bus='ps2'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='input2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <listen type='address' address='::0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <audio id='1' type='none'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='video0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <watchdog model='itco' action='reset'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='watchdog0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </watchdog>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <memballoon model='virtio'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <stats period='10'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='balloon0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <rng model='virtio'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='rng0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <label>system_u:system_r:svirt_t:s0:c13,c483</label>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c13,c483</imagelabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <label>+107:+107</label>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.543 2 INFO nova.virt.libvirt.driver [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully detached device tap7b31c42c-5f from instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf from the persistent domain config.
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.543 2 DEBUG nova.virt.libvirt.driver [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] (1/8): Attempting to detach device tap7b31c42c-5f with device alias net1 from instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.543 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <mac address="fa:16:3e:a5:99:e1"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <model type="virtio"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <mtu size="1442"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <target dev="tap7b31c42c-5f"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: </interface>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:38:19 np0005466013 kernel: tap7b31c42c-5f (unregistering): left promiscuous mode
Oct  2 08:38:19 np0005466013 NetworkManager[51205]: <info>  [1759408699.6516] device (tap7b31c42c-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:19Z|00716|binding|INFO|Releasing lport 7b31c42c-5f0b-40fa-b913-4546441b9444 from this chassis (sb_readonly=0)
Oct  2 08:38:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:19Z|00717|binding|INFO|Setting lport 7b31c42c-5f0b-40fa-b913-4546441b9444 down in Southbound
Oct  2 08:38:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:19Z|00718|binding|INFO|Removing iface tap7b31c42c-5f ovn-installed in OVS
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:19.672 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:99:e1 10.100.0.29', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a970b3c6-2fc3-4025-868b-2e9af396991a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf407807-38c2-4b6a-825d-3f40edf483e2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=7b31c42c-5f0b-40fa-b913-4546441b9444) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:38:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:19.674 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 7b31c42c-5f0b-40fa-b913-4546441b9444 in datapath a970b3c6-2fc3-4025-868b-2e9af396991a unbound from our chassis
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:19.678 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a970b3c6-2fc3-4025-868b-2e9af396991a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.681 2 DEBUG nova.virt.libvirt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Received event <DeviceRemovedEvent: 1759408699.6808293, 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  2 08:38:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:19.680 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b70d14-2b9b-4d01-addd-111cfd2cda17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:19.682 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a namespace which is not needed anymore
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.683 2 DEBUG nova.virt.libvirt.driver [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Start waiting for the detach event from libvirt for device tap7b31c42c-5f with device alias net1 for instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.683 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.686 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface>not found in domain: <domain type='kvm' id='74'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <name>instance-000000a0</name>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <uuid>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</uuid>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:37:05</nova:creationTime>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:port uuid="7b31c42c-5f0b-40fa-b913-4546441b9444">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <memory unit='KiB'>131072</memory>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <resource>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <partition>/machine</partition>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </resource>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <sysinfo type='smbios'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='serial'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='uuid'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <boot dev='hd'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <smbios mode='sysinfo'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <vmcoreinfo state='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <feature policy='require' name='x2apic'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <feature policy='require' name='vme'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <clock offset='utc'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <timer name='hpet' present='no'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <on_reboot>restart</on_reboot>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <on_crash>destroy</on_crash>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <disk type='file' device='disk'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk' index='2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <backingStore type='file' index='3'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <format type='raw'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <backingStore/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      </backingStore>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target dev='vda' bus='virtio'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='virtio-disk0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <disk type='file' device='cdrom'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config' index='1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <backingStore/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target dev='sda' bus='sata'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <readonly/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='sata0-0-0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pcie.0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='1' port='0x10'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='2' port='0x11'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='3' port='0x12'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='4' port='0x13'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='5' port='0x14'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='6' port='0x15'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='7' port='0x16'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='8' port='0x17'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.8'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='9' port='0x18'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.9'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='10' port='0x19'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.10'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='11' port='0x1a'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.11'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='12' port='0x1b'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.12'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='13' port='0x1c'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.13'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='14' port='0x1d'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.14'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='15' port='0x1e'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.15'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='16' port='0x1f'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.16'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='17' port='0x20'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.17'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='18' port='0x21'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.18'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='19' port='0x22'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.19'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='20' port='0x23'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.20'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='21' port='0x24'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.21'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='22' port='0x25'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.22'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='23' port='0x26'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.23'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='24' port='0x27'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.24'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target chassis='25' port='0x28'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.25'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model name='pcie-pci-bridge'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='pci.26'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='usb'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <controller type='sata' index='0'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='ide'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:16:78:94'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target dev='tap7fd89911-39'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='net0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <serial type='pty'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target type='isa-serial' port='0'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:        <model name='isa-serial'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      </target>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <target type='serial' port='0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </console>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <input type='tablet' bus='usb'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='input0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <input type='mouse' bus='ps2'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='input1'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <input type='keyboard' bus='ps2'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='input2'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <listen type='address' address='::0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <audio id='1' type='none'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='video0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <watchdog model='itco' action='reset'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='watchdog0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </watchdog>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <memballoon model='virtio'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <stats period='10'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='balloon0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <rng model='virtio'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <alias name='rng0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <label>system_u:system_r:svirt_t:s0:c13,c483</label>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c13,c483</imagelabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <label>+107:+107</label>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.687 2 INFO nova.virt.libvirt.driver [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully detached device tap7b31c42c-5f from instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf from the live domain config.
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.687 2 DEBUG nova.virt.libvirt.vif [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.687 2 DEBUG nova.network.os_vif_util [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.688 2 DEBUG nova.network.os_vif_util [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.688 2 DEBUG os_vif [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.691 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b31c42c-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.704 2 INFO os_vif [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f')#033[00m
Oct  2 08:38:19 np0005466013 nova_compute[192144]: 2025-10-02 12:38:19.705 2 DEBUG nova.virt.libvirt.guest [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:38:19</nova:creationTime>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:38:19 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:19 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:38:19 np0005466013 nova_compute[192144]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:38:19 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [NOTICE]   (247381) : haproxy version is 2.8.14-c23fe91
Oct  2 08:38:19 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [NOTICE]   (247381) : path to executable is /usr/sbin/haproxy
Oct  2 08:38:19 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [WARNING]  (247381) : Exiting Master process...
Oct  2 08:38:19 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [WARNING]  (247381) : Exiting Master process...
Oct  2 08:38:19 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [ALERT]    (247381) : Current worker (247383) exited with code 143 (Terminated)
Oct  2 08:38:19 np0005466013 neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a[247377]: [WARNING]  (247381) : All workers exited. Exiting... (0)
Oct  2 08:38:19 np0005466013 systemd[1]: libpod-cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281.scope: Deactivated successfully.
Oct  2 08:38:19 np0005466013 podman[247931]: 2025-10-02 12:38:19.963111975 +0000 UTC m=+0.145117781 container died cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:38:20 np0005466013 systemd[1]: var-lib-containers-storage-overlay-f49f39c1a025e10fb86042e071ed27a9be40dc397c9777c326dd7ae577c7855a-merged.mount: Deactivated successfully.
Oct  2 08:38:20 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281-userdata-shm.mount: Deactivated successfully.
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.062 2 DEBUG nova.compute.manager [req-51c19785-9687-4a67-b6fa-ea8d1a85ac6b req-f16ece12-5294-45ce-863f-fbd211386dc5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-unplugged-7b31c42c-5f0b-40fa-b913-4546441b9444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.062 2 DEBUG oslo_concurrency.lockutils [req-51c19785-9687-4a67-b6fa-ea8d1a85ac6b req-f16ece12-5294-45ce-863f-fbd211386dc5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.063 2 DEBUG oslo_concurrency.lockutils [req-51c19785-9687-4a67-b6fa-ea8d1a85ac6b req-f16ece12-5294-45ce-863f-fbd211386dc5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.063 2 DEBUG oslo_concurrency.lockutils [req-51c19785-9687-4a67-b6fa-ea8d1a85ac6b req-f16ece12-5294-45ce-863f-fbd211386dc5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.063 2 DEBUG nova.compute.manager [req-51c19785-9687-4a67-b6fa-ea8d1a85ac6b req-f16ece12-5294-45ce-863f-fbd211386dc5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] No waiting events found dispatching network-vif-unplugged-7b31c42c-5f0b-40fa-b913-4546441b9444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.063 2 WARNING nova.compute.manager [req-51c19785-9687-4a67-b6fa-ea8d1a85ac6b req-f16ece12-5294-45ce-863f-fbd211386dc5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received unexpected event network-vif-unplugged-7b31c42c-5f0b-40fa-b913-4546441b9444 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:38:20 np0005466013 podman[247931]: 2025-10-02 12:38:20.15144338 +0000 UTC m=+0.333449186 container cleanup cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:38:20 np0005466013 systemd[1]: libpod-conmon-cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281.scope: Deactivated successfully.
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.353 2 DEBUG oslo_concurrency.lockutils [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.354 2 DEBUG oslo_concurrency.lockutils [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.354 2 DEBUG nova.network.neutron [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.413 2 DEBUG nova.compute.manager [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-deleted-7b31c42c-5f0b-40fa-b913-4546441b9444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.414 2 INFO nova.compute.manager [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Neutron deleted interface 7b31c42c-5f0b-40fa-b913-4546441b9444; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.414 2 DEBUG nova.network.neutron [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.469 2 DEBUG nova.objects.instance [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'system_metadata' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:20 np0005466013 podman[247964]: 2025-10-02 12:38:20.488323584 +0000 UTC m=+0.307025995 container remove cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.498 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d25a566d-ad28-4666-8d33-7038dc402e16]: (4, ('Thu Oct  2 12:38:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a (cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281)\ncb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281\nThu Oct  2 12:38:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a (cb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281)\ncb050a0f43e622ce0fb5177ed18b89b83603510fe1080fa4a048c00822149281\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.502 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1a8e0e92-16b1-4dfc-b3f4-380230ae9962]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.504 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa970b3c6-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:20 np0005466013 kernel: tapa970b3c6-20: left promiscuous mode
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.518 2 DEBUG nova.objects.instance [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'flavor' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.535 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[862e7bc3-8e94-4d77-858e-bc19cb05b72a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.544 2 DEBUG nova.virt.libvirt.vif [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.545 2 DEBUG nova.network.os_vif_util [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.548 2 DEBUG nova.network.os_vif_util [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.551 2 DEBUG nova.virt.libvirt.guest [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.554 2 DEBUG nova.virt.libvirt.guest [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface>not found in domain: <domain type='kvm' id='74'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <name>instance-000000a0</name>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <uuid>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</uuid>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:38:19</nova:creationTime>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <memory unit='KiB'>131072</memory>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <resource>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <partition>/machine</partition>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </resource>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <sysinfo type='smbios'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='serial'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='uuid'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <boot dev='hd'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <smbios mode='sysinfo'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <vmcoreinfo state='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <feature policy='require' name='x2apic'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <feature policy='require' name='vme'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <clock offset='utc'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <timer name='hpet' present='no'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <on_reboot>restart</on_reboot>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <on_crash>destroy</on_crash>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <disk type='file' device='disk'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk' index='2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <backingStore type='file' index='3'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <format type='raw'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <backingStore/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      </backingStore>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target dev='vda' bus='virtio'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='virtio-disk0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <disk type='file' device='cdrom'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config' index='1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <backingStore/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target dev='sda' bus='sata'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <readonly/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='sata0-0-0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pcie.0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='1' port='0x10'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='2' port='0x11'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='3' port='0x12'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='4' port='0x13'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='5' port='0x14'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='6' port='0x15'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='7' port='0x16'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='8' port='0x17'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.8'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='9' port='0x18'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.9'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='10' port='0x19'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.10'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='11' port='0x1a'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.11'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='12' port='0x1b'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.12'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='13' port='0x1c'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.13'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='14' port='0x1d'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.14'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='15' port='0x1e'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.15'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='16' port='0x1f'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.16'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='17' port='0x20'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.17'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='18' port='0x21'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.18'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='19' port='0x22'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.19'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='20' port='0x23'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.20'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='21' port='0x24'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.21'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='22' port='0x25'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.22'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='23' port='0x26'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.23'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='24' port='0x27'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.24'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='25' port='0x28'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.25'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-pci-bridge'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.26'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='usb'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='sata' index='0'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='ide'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:16:78:94'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target dev='tap7fd89911-39'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='net0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <serial type='pty'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target type='isa-serial' port='0'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <model name='isa-serial'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      </target>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target type='serial' port='0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </console>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <input type='tablet' bus='usb'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='input0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <input type='mouse' bus='ps2'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='input1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <input type='keyboard' bus='ps2'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='input2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <listen type='address' address='::0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <audio id='1' type='none'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='video0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <watchdog model='itco' action='reset'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='watchdog0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </watchdog>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <memballoon model='virtio'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <stats period='10'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='balloon0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <rng model='virtio'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='rng0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <label>system_u:system_r:svirt_t:s0:c13,c483</label>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c13,c483</imagelabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <label>+107:+107</label>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.556 2 DEBUG nova.virt.libvirt.guest [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.559 2 DEBUG nova.virt.libvirt.guest [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:99:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b31c42c-5f"/></interface>not found in domain: <domain type='kvm' id='74'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <name>instance-000000a0</name>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <uuid>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</uuid>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:38:19</nova:creationTime>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <memory unit='KiB'>131072</memory>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <resource>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <partition>/machine</partition>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </resource>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <sysinfo type='smbios'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='serial'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='uuid'>2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <boot dev='hd'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <smbios mode='sysinfo'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <vmcoreinfo state='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <feature policy='require' name='x2apic'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <feature policy='require' name='vme'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <clock offset='utc'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <timer name='hpet' present='no'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <on_reboot>restart</on_reboot>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <on_crash>destroy</on_crash>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <disk type='file' device='disk'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk' index='2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <backingStore type='file' index='3'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <format type='raw'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <backingStore/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      </backingStore>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target dev='vda' bus='virtio'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='virtio-disk0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <disk type='file' device='cdrom'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/disk.config' index='1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <backingStore/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target dev='sda' bus='sata'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <readonly/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='sata0-0-0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pcie.0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='1' port='0x10'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='2' port='0x11'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='3' port='0x12'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='4' port='0x13'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='5' port='0x14'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='6' port='0x15'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='7' port='0x16'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='8' port='0x17'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.8'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='9' port='0x18'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.9'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='10' port='0x19'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.10'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='11' port='0x1a'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.11'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='12' port='0x1b'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.12'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='13' port='0x1c'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.13'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='14' port='0x1d'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.14'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='15' port='0x1e'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.15'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='16' port='0x1f'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.16'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='17' port='0x20'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.17'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='18' port='0x21'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.18'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='19' port='0x22'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.19'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='20' port='0x23'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.20'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='21' port='0x24'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.21'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='22' port='0x25'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.22'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='23' port='0x26'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.23'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='24' port='0x27'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.24'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-root-port'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target chassis='25' port='0x28'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.25'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model name='pcie-pci-bridge'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='pci.26'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='usb'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <controller type='sata' index='0'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='ide'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </controller>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <interface type='ethernet'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <mac address='fa:16:3e:16:78:94'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target dev='tap7fd89911-39'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model type='virtio'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <mtu size='1442'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='net0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <serial type='pty'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target type='isa-serial' port='0'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:        <model name='isa-serial'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      </target>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <source path='/dev/pts/1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <log file='/var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf/console.log' append='off'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <target type='serial' port='0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='serial0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </console>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <input type='tablet' bus='usb'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='input0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <input type='mouse' bus='ps2'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='input1'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <input type='keyboard' bus='ps2'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='input2'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </input>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <listen type='address' address='::0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </graphics>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <audio id='1' type='none'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='video0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <watchdog model='itco' action='reset'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='watchdog0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </watchdog>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <memballoon model='virtio'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <stats period='10'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='balloon0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <rng model='virtio'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <alias name='rng0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <label>system_u:system_r:svirt_t:s0:c13,c483</label>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c13,c483</imagelabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <label>+107:+107</label>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </seclabel>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.561 2 WARNING nova.virt.libvirt.driver [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Detaching interface fa:16:3e:a5:99:e1 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap7b31c42c-5f' not found.#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.562 2 DEBUG nova.virt.libvirt.vif [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.562 2 DEBUG nova.network.os_vif_util [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "7b31c42c-5f0b-40fa-b913-4546441b9444", "address": "fa:16:3e:a5:99:e1", "network": {"id": "a970b3c6-2fc3-4025-868b-2e9af396991a", "bridge": "br-int", "label": "tempest-network-smoke--441167180", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b31c42c-5f", "ovs_interfaceid": "7b31c42c-5f0b-40fa-b913-4546441b9444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.563 2 DEBUG nova.network.os_vif_util [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.563 2 DEBUG os_vif [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b31c42c-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.565 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.562 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b3e4b6-ec6d-4657-8ee0-6aa045e270f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.567 2 INFO os_vif [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:99:e1,bridge_name='br-int',has_traffic_filtering=True,id=7b31c42c-5f0b-40fa-b913-4546441b9444,network=Network(a970b3c6-2fc3-4025-868b-2e9af396991a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b31c42c-5f')#033[00m
Oct  2 08:38:20 np0005466013 nova_compute[192144]: 2025-10-02 12:38:20.568 2 DEBUG nova.virt.libvirt.guest [req-716bcc52-037d-4d10-b11c-b3d93ebb4096 req-f60c473e-bab7-4dea-960b-b0d3522ef545 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:name>tempest-TestNetworkBasicOps-server-1457788990</nova:name>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:creationTime>2025-10-02 12:38:20</nova:creationTime>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:flavor name="m1.nano">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:memory>128</nova:memory>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:disk>1</nova:disk>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:swap>0</nova:swap>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:flavor>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:owner>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:owner>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  <nova:ports>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    <nova:port uuid="7fd89911-3957-4f68-8adb-9a1c640f6bdb">
Oct  2 08:38:20 np0005466013 nova_compute[192144]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:    </nova:port>
Oct  2 08:38:20 np0005466013 nova_compute[192144]:  </nova:ports>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: </nova:instance>
Oct  2 08:38:20 np0005466013 nova_compute[192144]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.567 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb2c299-c02f-47bd-a1fe-47da6a8a182f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.582 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6e4163-c336-4e48-a52d-37bcc6323318]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661754, 'reachable_time': 28745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247979, 'error': None, 'target': 'ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.584 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a970b3c6-2fc3-4025-868b-2e9af396991a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:38:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:20.584 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a4ef97-49ae-4f45-b380-2084df342e4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:20 np0005466013 systemd[1]: run-netns-ovnmeta\x2da970b3c6\x2d2fc3\x2d4025\x2d868b\x2d2e9af396991a.mount: Deactivated successfully.
Oct  2 08:38:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:21Z|00719|binding|INFO|Releasing lport 02b99a60-d61e-4fd5-b4f5-ea414f3b2b3c from this chassis (sb_readonly=0)
Oct  2 08:38:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:21Z|00720|binding|INFO|Releasing lport e78bd1c4-7546-4ebe-a71b-a49e8c78f36c from this chassis (sb_readonly=0)
Oct  2 08:38:21 np0005466013 nova_compute[192144]: 2025-10-02 12:38:21.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.518 2 INFO nova.network.neutron [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Port 7b31c42c-5f0b-40fa-b913-4546441b9444 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.518 2 DEBUG nova.network.neutron [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.536 2 DEBUG oslo_concurrency.lockutils [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.564 2 DEBUG nova.compute.manager [req-ab0e6f26-a174-456e-9b9a-ad86e8f39169 req-d02f99ef-d161-468f-b08e-7e578dbe881e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.564 2 DEBUG oslo_concurrency.lockutils [req-ab0e6f26-a174-456e-9b9a-ad86e8f39169 req-d02f99ef-d161-468f-b08e-7e578dbe881e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.564 2 DEBUG oslo_concurrency.lockutils [req-ab0e6f26-a174-456e-9b9a-ad86e8f39169 req-d02f99ef-d161-468f-b08e-7e578dbe881e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.564 2 DEBUG oslo_concurrency.lockutils [req-ab0e6f26-a174-456e-9b9a-ad86e8f39169 req-d02f99ef-d161-468f-b08e-7e578dbe881e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.565 2 DEBUG nova.compute.manager [req-ab0e6f26-a174-456e-9b9a-ad86e8f39169 req-d02f99ef-d161-468f-b08e-7e578dbe881e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] No waiting events found dispatching network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.565 2 WARNING nova.compute.manager [req-ab0e6f26-a174-456e-9b9a-ad86e8f39169 req-d02f99ef-d161-468f-b08e-7e578dbe881e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received unexpected event network-vif-plugged-7b31c42c-5f0b-40fa-b913-4546441b9444 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.573 2 DEBUG oslo_concurrency.lockutils [None req-3a10a512-b7f4-408c-8427-03cd0d5a48c4 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-7b31c42c-5f0b-40fa-b913-4546441b9444" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:22 np0005466013 podman[247980]: 2025-10-02 12:38:22.716003151 +0000 UTC m=+0.081366136 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:38:22 np0005466013 podman[247981]: 2025-10-02 12:38:22.72177319 +0000 UTC m=+0.078858197 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:38:22 np0005466013 podman[247982]: 2025-10-02 12:38:22.761944531 +0000 UTC m=+0.113803115 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.959 2 DEBUG nova.compute.manager [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-changed-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.960 2 DEBUG nova.compute.manager [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing instance network info cache due to event network-changed-7fd89911-3957-4f68-8adb-9a1c640f6bdb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.960 2 DEBUG oslo_concurrency.lockutils [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.960 2 DEBUG oslo_concurrency.lockutils [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:22 np0005466013 nova_compute[192144]: 2025-10-02 12:38:22.960 2 DEBUG nova.network.neutron [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Refreshing network info cache for port 7fd89911-3957-4f68-8adb-9a1c640f6bdb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.068 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.068 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.069 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.069 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.069 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.085 2 INFO nova.compute.manager [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Terminating instance#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.095 2 DEBUG nova.compute.manager [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:38:23 np0005466013 kernel: tap7fd89911-39 (unregistering): left promiscuous mode
Oct  2 08:38:23 np0005466013 NetworkManager[51205]: <info>  [1759408703.1671] device (tap7fd89911-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:38:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:23Z|00721|binding|INFO|Releasing lport 7fd89911-3957-4f68-8adb-9a1c640f6bdb from this chassis (sb_readonly=0)
Oct  2 08:38:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:23Z|00722|binding|INFO|Setting lport 7fd89911-3957-4f68-8adb-9a1c640f6bdb down in Southbound
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:23Z|00723|binding|INFO|Removing iface tap7fd89911-39 ovn-installed in OVS
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.194 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:78:94 10.100.0.4'], port_security=['fa:16:3e:16:78:94 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f032af94-2449-4fc3-bf18-1eca195c2d1c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bd58f68-4db4-498a-b482-4891f2ea7922, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=7fd89911-3957-4f68-8adb-9a1c640f6bdb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.198 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 7fd89911-3957-4f68-8adb-9a1c640f6bdb in datapath f6b28beb-3fac-4e00-bd1f-932a66109b1d unbound from our chassis#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.201 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6b28beb-3fac-4e00-bd1f-932a66109b1d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.203 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c22d63d4-444f-48ad-bb30-3e9b079bdac1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.203 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d namespace which is not needed anymore#033[00m
Oct  2 08:38:23 np0005466013 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Oct  2 08:38:23 np0005466013 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000a0.scope: Consumed 18.671s CPU time.
Oct  2 08:38:23 np0005466013 systemd-machined[152202]: Machine qemu-74-instance-000000a0 terminated.
Oct  2 08:38:23 np0005466013 neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d[246897]: [NOTICE]   (246930) : haproxy version is 2.8.14-c23fe91
Oct  2 08:38:23 np0005466013 neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d[246897]: [NOTICE]   (246930) : path to executable is /usr/sbin/haproxy
Oct  2 08:38:23 np0005466013 neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d[246897]: [WARNING]  (246930) : Exiting Master process...
Oct  2 08:38:23 np0005466013 neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d[246897]: [ALERT]    (246930) : Current worker (246932) exited with code 143 (Terminated)
Oct  2 08:38:23 np0005466013 neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d[246897]: [WARNING]  (246930) : All workers exited. Exiting... (0)
Oct  2 08:38:23 np0005466013 systemd[1]: libpod-d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad.scope: Deactivated successfully.
Oct  2 08:38:23 np0005466013 podman[248071]: 2025-10-02 12:38:23.375593875 +0000 UTC m=+0.067604827 container died d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.381 2 INFO nova.virt.libvirt.driver [-] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Instance destroyed successfully.#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.381 2 DEBUG nova.objects.instance [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'resources' on Instance uuid 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.410 2 DEBUG nova.virt.libvirt.vif [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:36:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457788990',display_name='tempest-TestNetworkBasicOps-server-1457788990',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457788990',id=160,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAX+2XQrOd7KzesCSBvPQcBu33n2cNneMCsROY2zt8eQS2rwe7lFkCII3Yacz+uHDwkN7yLjJdqPxnhD7koMwAMvGoeCFoDh9M4G2I81ZizY75euwWGW9AYqhxHgyD2+w==',key_name='tempest-TestNetworkBasicOps-1430246833',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-dfdu8cck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.410 2 DEBUG nova.network.os_vif_util [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.411 2 DEBUG nova.network.os_vif_util [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:78:94,bridge_name='br-int',has_traffic_filtering=True,id=7fd89911-3957-4f68-8adb-9a1c640f6bdb,network=Network(f6b28beb-3fac-4e00-bd1f-932a66109b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fd89911-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.411 2 DEBUG os_vif [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:78:94,bridge_name='br-int',has_traffic_filtering=True,id=7fd89911-3957-4f68-8adb-9a1c640f6bdb,network=Network(f6b28beb-3fac-4e00-bd1f-932a66109b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fd89911-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.413 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fd89911-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.419 2 INFO os_vif [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:78:94,bridge_name='br-int',has_traffic_filtering=True,id=7fd89911-3957-4f68-8adb-9a1c640f6bdb,network=Network(f6b28beb-3fac-4e00-bd1f-932a66109b1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fd89911-39')#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.419 2 INFO nova.virt.libvirt.driver [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Deleting instance files /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf_del#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.420 2 INFO nova.virt.libvirt.driver [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Deletion of /var/lib/nova/instances/2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf_del complete#033[00m
Oct  2 08:38:23 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad-userdata-shm.mount: Deactivated successfully.
Oct  2 08:38:23 np0005466013 systemd[1]: var-lib-containers-storage-overlay-277afd71fb3fc70971fd153beb7e7fafeaa6e69bfa2b36ee3d3533aa1c9278e5-merged.mount: Deactivated successfully.
Oct  2 08:38:23 np0005466013 podman[248071]: 2025-10-02 12:38:23.468639873 +0000 UTC m=+0.160650835 container cleanup d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:38:23 np0005466013 systemd[1]: libpod-conmon-d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad.scope: Deactivated successfully.
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.510 2 INFO nova.compute.manager [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.511 2 DEBUG oslo.service.loopingcall [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.511 2 DEBUG nova.compute.manager [-] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.511 2 DEBUG nova.network.neutron [-] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:38:23 np0005466013 podman[248118]: 2025-10-02 12:38:23.599687015 +0000 UTC m=+0.109154441 container remove d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.605 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3216de1d-4e39-4acf-8ec3-4525724ec7c6]: (4, ('Thu Oct  2 12:38:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d (d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad)\nd0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad\nThu Oct  2 12:38:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d (d0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad)\nd0ec391301a98e9c709b7adb64263f7195426a9348857f5be62b4811e55fb1ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.609 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[da200c74-5de3-4e10-a255-7ce68962301f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.611 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6b28beb-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 kernel: tapf6b28beb-30: left promiscuous mode
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.618 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7f0e2b1a-dd31-4add-83c7-c33e12f5492a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 nova_compute[192144]: 2025-10-02 12:38:23.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.660 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e81180-0f74-406b-bba1-ea8ba9202c19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.662 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d75d23a2-2141-45f0-a312-346cbf45ce1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.680 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5cd636-530d-4eeb-9feb-23afe7ac5fb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658293, 'reachable_time': 42989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248133, 'error': None, 'target': 'ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.682 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6b28beb-3fac-4e00-bd1f-932a66109b1d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:38:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:23.682 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[1d164647-48c0-41f5-879f-53910af04838]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:23 np0005466013 systemd[1]: run-netns-ovnmeta\x2df6b28beb\x2d3fac\x2d4e00\x2dbd1f\x2d932a66109b1d.mount: Deactivated successfully.
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.166 2 DEBUG nova.network.neutron [-] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.188 2 INFO nova.compute.manager [-] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Took 0.68 seconds to deallocate network for instance.#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.248 2 DEBUG nova.compute.manager [req-4f5f78c2-7d52-47f0-a40e-79cc4c9bff23 req-ac5e1280-0619-475c-8235-12f215eca23d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-deleted-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.259 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.260 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.385 2 DEBUG nova.network.neutron [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updated VIF entry in instance network info cache for port 7fd89911-3957-4f68-8adb-9a1c640f6bdb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.385 2 DEBUG nova.network.neutron [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Updating instance_info_cache with network_info: [{"id": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "address": "fa:16:3e:16:78:94", "network": {"id": "f6b28beb-3fac-4e00-bd1f-932a66109b1d", "bridge": "br-int", "label": "tempest-network-smoke--1067683882", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fd89911-39", "ovs_interfaceid": "7fd89911-3957-4f68-8adb-9a1c640f6bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.422 2 DEBUG oslo_concurrency.lockutils [req-a76ed08e-cc92-4411-a6a9-34108a915b04 req-ccdd4d93-010d-4e36-950d-44b51a324438 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.444 2 DEBUG nova.compute.provider_tree [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.459 2 DEBUG nova.scheduler.client.report [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.490 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.513 2 INFO nova.scheduler.client.report [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Deleted allocations for instance 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf#033[00m
Oct  2 08:38:24 np0005466013 nova_compute[192144]: 2025-10-02 12:38:24.664 2 DEBUG oslo_concurrency.lockutils [None req-4b95cf5f-f07e-435d-bc9f-54a97d7384d2 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.039 2 DEBUG nova.compute.manager [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-unplugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.040 2 DEBUG oslo_concurrency.lockutils [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.040 2 DEBUG oslo_concurrency.lockutils [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.040 2 DEBUG oslo_concurrency.lockutils [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.041 2 DEBUG nova.compute.manager [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] No waiting events found dispatching network-vif-unplugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.041 2 WARNING nova.compute.manager [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received unexpected event network-vif-unplugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.041 2 DEBUG nova.compute.manager [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received event network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.041 2 DEBUG oslo_concurrency.lockutils [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.041 2 DEBUG oslo_concurrency.lockutils [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.041 2 DEBUG oslo_concurrency.lockutils [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.042 2 DEBUG nova.compute.manager [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] No waiting events found dispatching network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.042 2 WARNING nova.compute.manager [req-2e6f29e9-34da-4f53-a597-88e3305dcc40 req-cedc3955-03bb-4fee-beea-dd1c7f2f535f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Received unexpected event network-vif-plugged-7fd89911-3957-4f68-8adb-9a1c640f6bdb for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:38:25 np0005466013 nova_compute[192144]: 2025-10-02 12:38:25.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:28 np0005466013 nova_compute[192144]: 2025-10-02 12:38:28.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:30Z|00724|binding|INFO|Releasing lport e78bd1c4-7546-4ebe-a71b-a49e8c78f36c from this chassis (sb_readonly=0)
Oct  2 08:38:30 np0005466013 nova_compute[192144]: 2025-10-02 12:38:30.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466013 nova_compute[192144]: 2025-10-02 12:38:30.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:32 np0005466013 podman[248137]: 2025-10-02 12:38:32.724999304 +0000 UTC m=+0.071631352 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:38:32 np0005466013 podman[248136]: 2025-10-02 12:38:32.745685469 +0000 UTC m=+0.096319721 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, 
url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Oct  2 08:38:32 np0005466013 podman[248135]: 2025-10-02 12:38:32.746047401 +0000 UTC m=+0.100533954 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:38:33 np0005466013 nova_compute[192144]: 2025-10-02 12:38:33.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.765 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.766 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.766 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.767 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.767 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.783 2 INFO nova.compute.manager [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Terminating instance#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.797 2 DEBUG nova.compute.manager [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:38:34 np0005466013 kernel: tapa745c1a1-9c (unregistering): left promiscuous mode
Oct  2 08:38:34 np0005466013 NetworkManager[51205]: <info>  [1759408714.8187] device (tapa745c1a1-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:38:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:34Z|00725|binding|INFO|Releasing lport a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb from this chassis (sb_readonly=0)
Oct  2 08:38:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:34Z|00726|binding|INFO|Setting lport a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb down in Southbound
Oct  2 08:38:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:38:34Z|00727|binding|INFO|Removing iface tapa745c1a1-9c ovn-installed in OVS
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:34.851 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:c0:e4 10.100.0.6 2001:db8:0:1:f816:3eff:fecb:c0e4 2001:db8::f816:3eff:fecb:c0e4'], port_security=['fa:16:3e:cb:c0:e4 10.100.0.6 2001:db8:0:1:f816:3eff:fecb:c0e4 2001:db8::f816:3eff:fecb:c0e4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8:0:1:f816:3eff:fecb:c0e4/64 2001:db8::f816:3eff:fecb:c0e4/64', 'neutron:device_id': '5fc50e91-2988-453d-87f4-afbd5214d7f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '38601fe0-d139-4a59-b46e-238283b5fcdd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:34.852 103323 INFO neutron.agent.ovn.metadata.agent [-] Port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 unbound from our chassis#033[00m
Oct  2 08:38:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:34.854 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:34.856 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[46ce8d66-bb39-4bd0-a1d1-3e5313a80a90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:34.857 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 namespace which is not needed anymore#033[00m
Oct  2 08:38:34 np0005466013 nova_compute[192144]: 2025-10-02 12:38:34.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:34 np0005466013 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Oct  2 08:38:34 np0005466013 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a3.scope: Consumed 15.628s CPU time.
Oct  2 08:38:34 np0005466013 systemd-machined[152202]: Machine qemu-75-instance-000000a3 terminated.
Oct  2 08:38:34 np0005466013 podman[248190]: 2025-10-02 12:38:34.917313769 +0000 UTC m=+0.077311579 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:38:34 np0005466013 podman[248193]: 2025-10-02 12:38:34.930262032 +0000 UTC m=+0.083923084 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:38:35 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [NOTICE]   (247647) : haproxy version is 2.8.14-c23fe91
Oct  2 08:38:35 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [NOTICE]   (247647) : path to executable is /usr/sbin/haproxy
Oct  2 08:38:35 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [WARNING]  (247647) : Exiting Master process...
Oct  2 08:38:35 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [WARNING]  (247647) : Exiting Master process...
Oct  2 08:38:35 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [ALERT]    (247647) : Current worker (247649) exited with code 143 (Terminated)
Oct  2 08:38:35 np0005466013 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[247643]: [WARNING]  (247647) : All workers exited. Exiting... (0)
Oct  2 08:38:35 np0005466013 systemd[1]: libpod-6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70.scope: Deactivated successfully.
Oct  2 08:38:35 np0005466013 podman[248253]: 2025-10-02 12:38:35.027047787 +0000 UTC m=+0.061783455 container died 6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:38:35 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70-userdata-shm.mount: Deactivated successfully.
Oct  2 08:38:35 np0005466013 systemd[1]: var-lib-containers-storage-overlay-cde8710b46bd5dd8ae65822b81fd68b96f4c9f942fc195a828fd4b387b2aca42-merged.mount: Deactivated successfully.
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.068 2 INFO nova.virt.libvirt.driver [-] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Instance destroyed successfully.#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.069 2 DEBUG nova.objects.instance [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 5fc50e91-2988-453d-87f4-afbd5214d7f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:35 np0005466013 podman[248253]: 2025-10-02 12:38:35.073774152 +0000 UTC m=+0.108509850 container cleanup 6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.086 2 DEBUG nova.virt.libvirt.vif [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:37:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1257094405',display_name='tempest-TestGettingAddress-server-1257094405',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1257094405',id=163,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgra/wXYI+rCsG5upBBPiIDSxRkzAR1A6pFxaSU1LPFfL3D5RfEN0Sz4k+PeFJCJFhU6eEresOI7XeTo6tERj3riWEwLSsbwiPk4PW1j9Dz/nyAQSV9AMMgUGCskGAq1Q==',key_name='tempest-TestGettingAddress-228200713',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:37:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-24sur2rw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:37:31Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=5fc50e91-2988-453d-87f4-afbd5214d7f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.087 2 DEBUG nova.network.os_vif_util [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:35 np0005466013 systemd[1]: libpod-conmon-6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70.scope: Deactivated successfully.
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.088 2 DEBUG nova.network.os_vif_util [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:c0:e4,bridge_name='br-int',has_traffic_filtering=True,id=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745c1a1-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.088 2 DEBUG os_vif [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:c0:e4,bridge_name='br-int',has_traffic_filtering=True,id=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745c1a1-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.091 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa745c1a1-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.098 2 INFO os_vif [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:c0:e4,bridge_name='br-int',has_traffic_filtering=True,id=a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745c1a1-9c')#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.099 2 INFO nova.virt.libvirt.driver [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Deleting instance files /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6_del#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.100 2 INFO nova.virt.libvirt.driver [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Deletion of /var/lib/nova/instances/5fc50e91-2988-453d-87f4-afbd5214d7f6_del complete#033[00m
Oct  2 08:38:35 np0005466013 podman[248301]: 2025-10-02 12:38:35.153438304 +0000 UTC m=+0.047469009 container remove 6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.160 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7dfc4566-eba4-4470-ab69-cf46a8493ac3]: (4, ('Thu Oct  2 12:38:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 (6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70)\n6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70\nThu Oct  2 12:38:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 (6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70)\n6f0cb77385badf45492fbb84f9da4bdc8b1d32c37eadba993d8ae16483881a70\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.162 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bdae85a5-7945-43e5-a21b-ad540821bc9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.163 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7388dd-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:35 np0005466013 kernel: tap1d7388dd-d0: left promiscuous mode
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.183 2 INFO nova.compute.manager [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.184 2 DEBUG oslo.service.loopingcall [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.184 2 DEBUG nova.compute.manager [-] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.184 2 DEBUG nova.network.neutron [-] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.186 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc66bd9-582d-4b70-8b79-fd2e4f6996aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.225 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[de03849e-42d9-4fc9-82ac-5a4fb8ebae0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.227 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[29acc990-fb77-41e4-8efe-5a84ee15ff32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.237 2 DEBUG nova.compute.manager [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-changed-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.238 2 DEBUG nova.compute.manager [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Refreshing instance network info cache due to event network-changed-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.238 2 DEBUG oslo_concurrency.lockutils [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.238 2 DEBUG oslo_concurrency.lockutils [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.239 2 DEBUG nova.network.neutron [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Refreshing network info cache for port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.259 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[63d40db4-304d-4586-8ec9-e4d98c069191]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 664259, 'reachable_time': 41825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248316, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.263 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:38:35 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:35.263 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[62ace298-9b4e-428e-a9f6-8dce678d60a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:35 np0005466013 systemd[1]: run-netns-ovnmeta\x2d1d7388dd\x2dd8ef\x2d404d\x2d8bb8\x2d6f3d3ab763b6.mount: Deactivated successfully.
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.473 2 DEBUG nova.compute.manager [req-170bed69-232c-4a40-a101-24d9e88b610d req-aac8d3a8-9a8d-42d2-ae39-e0e65fde3ed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-vif-unplugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.474 2 DEBUG oslo_concurrency.lockutils [req-170bed69-232c-4a40-a101-24d9e88b610d req-aac8d3a8-9a8d-42d2-ae39-e0e65fde3ed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.475 2 DEBUG oslo_concurrency.lockutils [req-170bed69-232c-4a40-a101-24d9e88b610d req-aac8d3a8-9a8d-42d2-ae39-e0e65fde3ed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.475 2 DEBUG oslo_concurrency.lockutils [req-170bed69-232c-4a40-a101-24d9e88b610d req-aac8d3a8-9a8d-42d2-ae39-e0e65fde3ed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.475 2 DEBUG nova.compute.manager [req-170bed69-232c-4a40-a101-24d9e88b610d req-aac8d3a8-9a8d-42d2-ae39-e0e65fde3ed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] No waiting events found dispatching network-vif-unplugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.476 2 DEBUG nova.compute.manager [req-170bed69-232c-4a40-a101-24d9e88b610d req-aac8d3a8-9a8d-42d2-ae39-e0e65fde3ed6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-vif-unplugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:35 np0005466013 nova_compute[192144]: 2025-10-02 12:38:35.990 2 DEBUG nova.network.neutron [-] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.007 2 INFO nova.compute.manager [-] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Took 0.82 seconds to deallocate network for instance.#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.082 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.083 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.149 2 DEBUG nova.compute.provider_tree [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.164 2 DEBUG nova.scheduler.client.report [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.182 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.213 2 INFO nova.scheduler.client.report [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 5fc50e91-2988-453d-87f4-afbd5214d7f6#033[00m
Oct  2 08:38:36 np0005466013 nova_compute[192144]: 2025-10-02 12:38:36.294 2 DEBUG oslo_concurrency.lockutils [None req-bc0a8668-db34-455c-b5db-614d049cbd25 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.343 2 DEBUG nova.compute.manager [req-a0c4ed67-0ea5-4e5b-ae7e-0a1cda58e1fe req-e3933ae8-9631-4677-9e40-1f55a4108e9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-vif-deleted-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.576 2 DEBUG nova.compute.manager [req-bc95c172-0515-43c0-a760-4452bea431b3 req-10d3f090-4073-4afa-bf3a-36f02d6f9670 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received event network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.577 2 DEBUG oslo_concurrency.lockutils [req-bc95c172-0515-43c0-a760-4452bea431b3 req-10d3f090-4073-4afa-bf3a-36f02d6f9670 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.577 2 DEBUG oslo_concurrency.lockutils [req-bc95c172-0515-43c0-a760-4452bea431b3 req-10d3f090-4073-4afa-bf3a-36f02d6f9670 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.578 2 DEBUG oslo_concurrency.lockutils [req-bc95c172-0515-43c0-a760-4452bea431b3 req-10d3f090-4073-4afa-bf3a-36f02d6f9670 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "5fc50e91-2988-453d-87f4-afbd5214d7f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.578 2 DEBUG nova.compute.manager [req-bc95c172-0515-43c0-a760-4452bea431b3 req-10d3f090-4073-4afa-bf3a-36f02d6f9670 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] No waiting events found dispatching network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.579 2 WARNING nova.compute.manager [req-bc95c172-0515-43c0-a760-4452bea431b3 req-10d3f090-4073-4afa-bf3a-36f02d6f9670 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Received unexpected event network-vif-plugged-a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.863 2 DEBUG nova.network.neutron [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updated VIF entry in instance network info cache for port a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.864 2 DEBUG nova.network.neutron [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Updating instance_info_cache with network_info: [{"id": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "address": "fa:16:3e:cb:c0:e4", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecb:c0e4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745c1a1-9c", "ovs_interfaceid": "a745c1a1-9c5e-4ac5-8c96-4b9a2f53ddcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:37 np0005466013 nova_compute[192144]: 2025-10-02 12:38:37.881 2 DEBUG oslo_concurrency.lockutils [req-1b4bf145-263a-4bcd-97fc-0e7bdf5b76ae req-4dbf5232-a894-49c0-91e4-b47486955a93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-5fc50e91-2988-453d-87f4-afbd5214d7f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:38 np0005466013 nova_compute[192144]: 2025-10-02 12:38:38.380 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408703.3782375, 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:38 np0005466013 nova_compute[192144]: 2025-10-02 12:38:38.381 2 INFO nova.compute.manager [-] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:38:38 np0005466013 nova_compute[192144]: 2025-10-02 12:38:38.400 2 DEBUG nova.compute.manager [None req-22d602c7-ba89-4d81-942e-58d3513393ca - - - - - -] [instance: 2438f0aa-8b8f-46d0-9f94-e5a331c9e5bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:39 np0005466013 nova_compute[192144]: 2025-10-02 12:38:39.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:39 np0005466013 nova_compute[192144]: 2025-10-02 12:38:39.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:38:40 np0005466013 nova_compute[192144]: 2025-10-02 12:38:40.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:40 np0005466013 nova_compute[192144]: 2025-10-02 12:38:40.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:40 np0005466013 nova_compute[192144]: 2025-10-02 12:38:40.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:40 np0005466013 nova_compute[192144]: 2025-10-02 12:38:40.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.018 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.018 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.019 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.234 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.234 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5707MB free_disk=73.13351058959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.235 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.235 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.312 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.313 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.347 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.361 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.390 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:38:41 np0005466013 nova_compute[192144]: 2025-10-02 12:38:41.390 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:42 np0005466013 nova_compute[192144]: 2025-10-02 12:38:42.392 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:42 np0005466013 nova_compute[192144]: 2025-10-02 12:38:42.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:42 np0005466013 nova_compute[192144]: 2025-10-02 12:38:42.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:42 np0005466013 nova_compute[192144]: 2025-10-02 12:38:42.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:44.551 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:44 np0005466013 nova_compute[192144]: 2025-10-02 12:38:44.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:44.553 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:38:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:44.553 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:45 np0005466013 nova_compute[192144]: 2025-10-02 12:38:45.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:45 np0005466013 nova_compute[192144]: 2025-10-02 12:38:45.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:45 np0005466013 nova_compute[192144]: 2025-10-02 12:38:45.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:45 np0005466013 nova_compute[192144]: 2025-10-02 12:38:45.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:46 np0005466013 nova_compute[192144]: 2025-10-02 12:38:46.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:50 np0005466013 nova_compute[192144]: 2025-10-02 12:38:50.063 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408715.0614464, 5fc50e91-2988-453d-87f4-afbd5214d7f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:50 np0005466013 nova_compute[192144]: 2025-10-02 12:38:50.064 2 INFO nova.compute.manager [-] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:38:50 np0005466013 nova_compute[192144]: 2025-10-02 12:38:50.144 2 DEBUG nova.compute.manager [None req-7ffacf4a-1fc9-451c-91b9-e90f7e2a678d - - - - - -] [instance: 5fc50e91-2988-453d-87f4-afbd5214d7f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:50 np0005466013 nova_compute[192144]: 2025-10-02 12:38:50.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:50 np0005466013 nova_compute[192144]: 2025-10-02 12:38:50.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:51 np0005466013 nova_compute[192144]: 2025-10-02 12:38:51.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:52 np0005466013 nova_compute[192144]: 2025-10-02 12:38:52.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:52 np0005466013 nova_compute[192144]: 2025-10-02 12:38:52.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:38:53 np0005466013 nova_compute[192144]: 2025-10-02 12:38:53.023 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:38:53 np0005466013 nova_compute[192144]: 2025-10-02 12:38:53.024 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:53 np0005466013 podman[248319]: 2025-10-02 12:38:53.692283422 +0000 UTC m=+0.068710831 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:38:53 np0005466013 podman[248320]: 2025-10-02 12:38:53.695162312 +0000 UTC m=+0.068763453 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:38:53 np0005466013 podman[248321]: 2025-10-02 12:38:53.747040758 +0000 UTC m=+0.105805297 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 08:38:55 np0005466013 nova_compute[192144]: 2025-10-02 12:38:55.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:55.258 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8::f816:3eff:fe2b:a121'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad07d234-3bc8-429a-8834-7a9ae3274be2) old=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:55.259 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad07d234-3bc8-429a-8834-7a9ae3274be2 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 updated#033[00m
Oct  2 08:38:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:55.260 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:55 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:55.261 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e148b450-f978-48f7-a089-7d5094ae1b72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:55 np0005466013 nova_compute[192144]: 2025-10-02 12:38:55.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:59.578 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8:0:1:f816:3eff:fe2b:a121 2001:db8::f816:3eff:fe2b:a121'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe2b:a121/64 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad07d234-3bc8-429a-8834-7a9ae3274be2) old=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8::f816:3eff:fe2b:a121'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:59.581 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad07d234-3bc8-429a-8834-7a9ae3274be2 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 updated#033[00m
Oct  2 08:38:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:59.583 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:38:59.584 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f67edaf7-d1cd-49bf-b3ae-055f1babbd68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:00 np0005466013 nova_compute[192144]: 2025-10-02 12:39:00.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:00 np0005466013 nova_compute[192144]: 2025-10-02 12:39:00.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:02.323 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:02.323 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:02.324 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:03 np0005466013 podman[248385]: 2025-10-02 12:39:03.678857629 +0000 UTC m=+0.057287666 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:39:03 np0005466013 podman[248386]: 2025-10-02 12:39:03.687712055 +0000 UTC m=+0.062845069 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal)
Oct  2 08:39:03 np0005466013 podman[248387]: 2025-10-02 12:39:03.714732926 +0000 UTC m=+0.086422543 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Oct  2 08:39:05 np0005466013 nova_compute[192144]: 2025-10-02 12:39:05.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:05 np0005466013 nova_compute[192144]: 2025-10-02 12:39:05.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:05 np0005466013 podman[248446]: 2025-10-02 12:39:05.713922866 +0000 UTC m=+0.076572476 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:39:05 np0005466013 podman[248445]: 2025-10-02 12:39:05.719871941 +0000 UTC m=+0.094722432 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:39:10 np0005466013 nova_compute[192144]: 2025-10-02 12:39:10.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466013 nova_compute[192144]: 2025-10-02 12:39:10.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:12 np0005466013 nova_compute[192144]: 2025-10-02 12:39:12.838 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:12 np0005466013 nova_compute[192144]: 2025-10-02 12:39:12.839 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:12 np0005466013 nova_compute[192144]: 2025-10-02 12:39:12.855 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:39:12 np0005466013 nova_compute[192144]: 2025-10-02 12:39:12.954 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:12 np0005466013 nova_compute[192144]: 2025-10-02 12:39:12.955 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:12 np0005466013 nova_compute[192144]: 2025-10-02 12:39:12.962 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:39:12 np0005466013 nova_compute[192144]: 2025-10-02 12:39:12.963 2 INFO nova.compute.claims [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.091 2 DEBUG nova.compute.provider_tree [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.106 2 DEBUG nova.scheduler.client.report [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.127 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.128 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.180 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.180 2 DEBUG nova.network.neutron [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.197 2 INFO nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.214 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.334 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.336 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.336 2 INFO nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Creating image(s)#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.337 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "/var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.337 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.338 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.355 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.442 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.443 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.443 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.453 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.512 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.513 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.926 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk 1073741824" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.927 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.928 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.993 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.994 2 DEBUG nova.virt.disk.api [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Checking if we can resize image /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:39:13 np0005466013 nova_compute[192144]: 2025-10-02 12:39:13.994 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.052 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.054 2 DEBUG nova.virt.disk.api [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Cannot resize image /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.055 2 DEBUG nova.objects.instance [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c5a0e30-230d-427f-8ffd-2459f12ec78e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.070 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.071 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Ensure instance console log exists: /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.072 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.072 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.072 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:14 np0005466013 nova_compute[192144]: 2025-10-02 12:39:14.288 2 DEBUG nova.policy [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.218 2 DEBUG nova.network.neutron [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Successfully updated port: b2d256d9-6788-41ed-a218-ab6139d999cb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.235 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-9c5a0e30-230d-427f-8ffd-2459f12ec78e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.235 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-9c5a0e30-230d-427f-8ffd-2459f12ec78e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.236 2 DEBUG nova.network.neutron [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.292 2 DEBUG nova.compute.manager [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received event network-changed-b2d256d9-6788-41ed-a218-ab6139d999cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.292 2 DEBUG nova.compute.manager [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Refreshing instance network info cache due to event network-changed-b2d256d9-6788-41ed-a218-ab6139d999cb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.292 2 DEBUG oslo_concurrency.lockutils [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-9c5a0e30-230d-427f-8ffd-2459f12ec78e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.417 2 DEBUG nova.network.neutron [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:39:15 np0005466013 nova_compute[192144]: 2025-10-02 12:39:15.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:39:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.522 2 DEBUG nova.network.neutron [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Updating instance_info_cache with network_info: [{"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.542 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-9c5a0e30-230d-427f-8ffd-2459f12ec78e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.542 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Instance network_info: |[{"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.542 2 DEBUG oslo_concurrency.lockutils [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-9c5a0e30-230d-427f-8ffd-2459f12ec78e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.543 2 DEBUG nova.network.neutron [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Refreshing network info cache for port b2d256d9-6788-41ed-a218-ab6139d999cb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.545 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Start _get_guest_xml network_info=[{"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.549 2 WARNING nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.553 2 DEBUG nova.virt.libvirt.host [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.554 2 DEBUG nova.virt.libvirt.host [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.556 2 DEBUG nova.virt.libvirt.host [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.557 2 DEBUG nova.virt.libvirt.host [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.558 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.558 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.559 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.559 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.559 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.559 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.559 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.560 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.560 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.560 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.560 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.561 2 DEBUG nova.virt.hardware [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.564 2 DEBUG nova.virt.libvirt.vif [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-555592205',display_name='tempest-TestNetworkBasicOps-server-555592205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-555592205',id=168,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGbxc6q0Wnt6CUsjPK2DStON6HOjT3BcHvBERZfcZOJWtRfL5sAnOvqJCqKS5RgU8YP9fiJsOommoyAcbsnsX7YwYYXbA+WpeOp372gCPNe82JFzK/uLGwdQxKUr5+F/0g==',key_name='tempest-TestNetworkBasicOps-363464227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-5g0b3agz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:13Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=9c5a0e30-230d-427f-8ffd-2459f12ec78e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.564 2 DEBUG nova.network.os_vif_util [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.565 2 DEBUG nova.network.os_vif_util [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:80:1f,bridge_name='br-int',has_traffic_filtering=True,id=b2d256d9-6788-41ed-a218-ab6139d999cb,network=Network(670889c7-549b-45d0-be10-992f080979ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb2d256d9-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.565 2 DEBUG nova.objects.instance [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c5a0e30-230d-427f-8ffd-2459f12ec78e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.577 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <uuid>9c5a0e30-230d-427f-8ffd-2459f12ec78e</uuid>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <name>instance-000000a8</name>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestNetworkBasicOps-server-555592205</nova:name>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:39:17</nova:creationTime>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        <nova:port uuid="b2d256d9-6788-41ed-a218-ab6139d999cb">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <entry name="serial">9c5a0e30-230d-427f-8ffd-2459f12ec78e</entry>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <entry name="uuid">9c5a0e30-230d-427f-8ffd-2459f12ec78e</entry>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk.config"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:02:80:1f"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <target dev="tapb2d256d9-67"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/console.log" append="off"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:39:17 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:39:17 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:39:17 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:39:17 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.578 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Preparing to wait for external event network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.578 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.578 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.579 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.579 2 DEBUG nova.virt.libvirt.vif [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-555592205',display_name='tempest-TestNetworkBasicOps-server-555592205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-555592205',id=168,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGbxc6q0Wnt6CUsjPK2DStON6HOjT3BcHvBERZfcZOJWtRfL5sAnOvqJCqKS5RgU8YP9fiJsOommoyAcbsnsX7YwYYXbA+WpeOp372gCPNe82JFzK/uLGwdQxKUr5+F/0g==',key_name='tempest-TestNetworkBasicOps-363464227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-5g0b3agz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:13Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=9c5a0e30-230d-427f-8ffd-2459f12ec78e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.579 2 DEBUG nova.network.os_vif_util [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.580 2 DEBUG nova.network.os_vif_util [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:80:1f,bridge_name='br-int',has_traffic_filtering=True,id=b2d256d9-6788-41ed-a218-ab6139d999cb,network=Network(670889c7-549b-45d0-be10-992f080979ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb2d256d9-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.580 2 DEBUG os_vif [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:80:1f,bridge_name='br-int',has_traffic_filtering=True,id=b2d256d9-6788-41ed-a218-ab6139d999cb,network=Network(670889c7-549b-45d0-be10-992f080979ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb2d256d9-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.584 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2d256d9-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.584 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb2d256d9-67, col_values=(('external_ids', {'iface-id': 'b2d256d9-6788-41ed-a218-ab6139d999cb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:80:1f', 'vm-uuid': '9c5a0e30-230d-427f-8ffd-2459f12ec78e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:17 np0005466013 NetworkManager[51205]: <info>  [1759408757.5865] manager: (tapb2d256d9-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.593 2 INFO os_vif [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:80:1f,bridge_name='br-int',has_traffic_filtering=True,id=b2d256d9-6788-41ed-a218-ab6139d999cb,network=Network(670889c7-549b-45d0-be10-992f080979ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb2d256d9-67')#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.662 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.663 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.663 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:02:80:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:39:17 np0005466013 nova_compute[192144]: 2025-10-02 12:39:17.663 2 INFO nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Using config drive#033[00m
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.403 2 INFO nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Creating config drive at /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk.config#033[00m
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.414 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvccoz6f2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.545 2 DEBUG oslo_concurrency.processutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvccoz6f2" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:18 np0005466013 kernel: tapb2d256d9-67: entered promiscuous mode
Oct  2 08:39:18 np0005466013 NetworkManager[51205]: <info>  [1759408758.6086] manager: (tapb2d256d9-67): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Oct  2 08:39:18 np0005466013 systemd-udevd[248519]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:39:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:18Z|00728|binding|INFO|Claiming lport b2d256d9-6788-41ed-a218-ab6139d999cb for this chassis.
Oct  2 08:39:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:18Z|00729|binding|INFO|b2d256d9-6788-41ed-a218-ab6139d999cb: Claiming fa:16:3e:02:80:1f 10.100.0.13
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:18 np0005466013 NetworkManager[51205]: <info>  [1759408758.6784] device (tapb2d256d9-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:39:18 np0005466013 NetworkManager[51205]: <info>  [1759408758.6797] device (tapb2d256d9-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:39:18 np0005466013 NetworkManager[51205]: <info>  [1759408758.6839] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Oct  2 08:39:18 np0005466013 NetworkManager[51205]: <info>  [1759408758.6847] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:18 np0005466013 systemd-machined[152202]: New machine qemu-76-instance-000000a8.
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.700 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:80:1f 10.100.0.13'], port_security=['fa:16:3e:02:80:1f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1976934795', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9c5a0e30-230d-427f-8ffd-2459f12ec78e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-670889c7-549b-45d0-be10-992f080979ef', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1976934795', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1aab0b39-6daf-41d1-a7da-b7bb077ff5e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54854aa2-539b-45ea-833b-3fc4d3ced3bf, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=b2d256d9-6788-41ed-a218-ab6139d999cb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.703 103323 INFO neutron.agent.ovn.metadata.agent [-] Port b2d256d9-6788-41ed-a218-ab6139d999cb in datapath 670889c7-549b-45d0-be10-992f080979ef bound to our chassis#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.705 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 670889c7-549b-45d0-be10-992f080979ef#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.719 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cc74b31c-df52-4af1-808c-992fa5b08401]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.721 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap670889c7-51 in ovnmeta-670889c7-549b-45d0-be10-992f080979ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.723 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap670889c7-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.724 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5fab862e-2ab9-46f3-9741-c810f7f99dcf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 systemd[1]: Started Virtual Machine qemu-76-instance-000000a8.
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.725 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dc39770c-af6b-4393-bd86-0da442622417]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.740 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[701a3efa-5f85-44ee-ae25-adea77deb8e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.771 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c46a57f8-0dd5-4e5c-bc25-dd595651e553]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.796 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[ca499ddc-d0dd-4cd6-8068-755cfdce859b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 systemd-udevd[248523]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:39:18 np0005466013 NetworkManager[51205]: <info>  [1759408758.8071] manager: (tap670889c7-50): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.806 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[875f73f7-b499-426f-a0d3-143b14f59859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:18Z|00730|binding|INFO|Setting lport b2d256d9-6788-41ed-a218-ab6139d999cb ovn-installed in OVS
Oct  2 08:39:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:18Z|00731|binding|INFO|Setting lport b2d256d9-6788-41ed-a218-ab6139d999cb up in Southbound
Oct  2 08:39:18 np0005466013 nova_compute[192144]: 2025-10-02 12:39:18.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.841 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9af1e952-64a3-4cbb-bad1-0ca2bc5059ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.844 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0856527f-6ab4-4371-a938-3dc1c433b88a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 NetworkManager[51205]: <info>  [1759408758.8705] device (tap670889c7-50): carrier: link connected
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.877 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[71fc2efc-4fd2-4f86-94b4-baa688735b16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.900 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[573ba0e1-3501-400f-8233-0d0dd027602b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap670889c7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:42:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675048, 'reachable_time': 44176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248555, 'error': None, 'target': 'ovnmeta-670889c7-549b-45d0-be10-992f080979ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.926 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e7be02b5-ab03-45d9-8743-7018f768b53d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:4259'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675048, 'tstamp': 675048}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248558, 'error': None, 'target': 'ovnmeta-670889c7-549b-45d0-be10-992f080979ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.952 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[204ac1d1-ecb5-450c-8f05-bc8266ab9a76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap670889c7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:42:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675048, 'reachable_time': 44176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248563, 'error': None, 'target': 'ovnmeta-670889c7-549b-45d0-be10-992f080979ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:18 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:18.994 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5007b07c-a2e6-46f4-ae11-f17c476b8d68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.066 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5bef5764-b571-46b8-a868-dba06a7a7d21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.067 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap670889c7-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.068 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.069 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap670889c7-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:19 np0005466013 kernel: tap670889c7-50: entered promiscuous mode
Oct  2 08:39:19 np0005466013 NetworkManager[51205]: <info>  [1759408759.0720] manager: (tap670889c7-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.075 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap670889c7-50, col_values=(('external_ids', {'iface-id': 'cdb35ee2-22af-436d-82f3-c08eadf2b2c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:19 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:19Z|00732|binding|INFO|Releasing lport cdb35ee2-22af-436d-82f3-c08eadf2b2c7 from this chassis (sb_readonly=0)
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.078 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/670889c7-549b-45d0-be10-992f080979ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/670889c7-549b-45d0-be10-992f080979ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.079 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ff37ba4f-0cdd-4607-8a24-a597a55ba9a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.080 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-670889c7-549b-45d0-be10-992f080979ef
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/670889c7-549b-45d0-be10-992f080979ef.pid.haproxy
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 670889c7-549b-45d0-be10-992f080979ef
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:39:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:19.082 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-670889c7-549b-45d0-be10-992f080979ef', 'env', 'PROCESS_TAG=haproxy-670889c7-549b-45d0-be10-992f080979ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/670889c7-549b-45d0-be10-992f080979ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.397 2 DEBUG nova.compute.manager [req-2c29d717-ffd3-4438-9f2a-e688bae03390 req-bedc16a9-83ee-4ae0-ae3e-1c562fdb714d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received event network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.400 2 DEBUG oslo_concurrency.lockutils [req-2c29d717-ffd3-4438-9f2a-e688bae03390 req-bedc16a9-83ee-4ae0-ae3e-1c562fdb714d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.400 2 DEBUG oslo_concurrency.lockutils [req-2c29d717-ffd3-4438-9f2a-e688bae03390 req-bedc16a9-83ee-4ae0-ae3e-1c562fdb714d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.401 2 DEBUG oslo_concurrency.lockutils [req-2c29d717-ffd3-4438-9f2a-e688bae03390 req-bedc16a9-83ee-4ae0-ae3e-1c562fdb714d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.401 2 DEBUG nova.compute.manager [req-2c29d717-ffd3-4438-9f2a-e688bae03390 req-bedc16a9-83ee-4ae0-ae3e-1c562fdb714d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Processing event network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.423 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408759.4230697, 9c5a0e30-230d-427f-8ffd-2459f12ec78e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.424 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] VM Started (Lifecycle Event)#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.426 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.433 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.440 2 INFO nova.virt.libvirt.driver [-] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Instance spawned successfully.#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.440 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.445 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.447 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.463 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.463 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408759.4239306, 9c5a0e30-230d-427f-8ffd-2459f12ec78e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.464 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.469 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.469 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.470 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.471 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.471 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.472 2 DEBUG nova.virt.libvirt.driver [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.480 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.482 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408759.4286072, 9c5a0e30-230d-427f-8ffd-2459f12ec78e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.483 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.502 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.505 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.537 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.573 2 INFO nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Took 6.24 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.575 2 DEBUG nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:19 np0005466013 podman[248595]: 2025-10-02 12:39:19.487573469 +0000 UTC m=+0.028117266 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:39:19 np0005466013 podman[248595]: 2025-10-02 12:39:19.672639443 +0000 UTC m=+0.213183190 container create 1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.681 2 INFO nova.compute.manager [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Took 6.77 seconds to build instance.#033[00m
Oct  2 08:39:19 np0005466013 nova_compute[192144]: 2025-10-02 12:39:19.704 2 DEBUG oslo_concurrency.lockutils [None req-af597c8a-d4d0-4207-bc99-d0280c76e4c5 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.865s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:19 np0005466013 systemd[1]: Started libpod-conmon-1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422.scope.
Oct  2 08:39:19 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:39:19 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/128a1bb4742fffd0cd863f9ebd7b46e24274ec6bb99396e73b0e01fbdf727a74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:39:19 np0005466013 podman[248595]: 2025-10-02 12:39:19.841478302 +0000 UTC m=+0.382022089 container init 1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:39:19 np0005466013 podman[248595]: 2025-10-02 12:39:19.846323363 +0000 UTC m=+0.386867120 container start 1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:39:19 np0005466013 neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef[248610]: [NOTICE]   (248614) : New worker (248616) forked
Oct  2 08:39:19 np0005466013 neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef[248610]: [NOTICE]   (248614) : Loading success.
Oct  2 08:39:20 np0005466013 nova_compute[192144]: 2025-10-02 12:39:20.264 2 DEBUG nova.network.neutron [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Updated VIF entry in instance network info cache for port b2d256d9-6788-41ed-a218-ab6139d999cb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:39:20 np0005466013 nova_compute[192144]: 2025-10-02 12:39:20.264 2 DEBUG nova.network.neutron [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Updating instance_info_cache with network_info: [{"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:20 np0005466013 nova_compute[192144]: 2025-10-02 12:39:20.284 2 DEBUG oslo_concurrency.lockutils [req-f7c39c3e-465b-4d1a-8871-eb38b67fd422 req-c470ef69-dd5f-42aa-87f7-d7cf5a4108de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-9c5a0e30-230d-427f-8ffd-2459f12ec78e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:20 np0005466013 nova_compute[192144]: 2025-10-02 12:39:20.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:21 np0005466013 nova_compute[192144]: 2025-10-02 12:39:21.502 2 DEBUG nova.compute.manager [req-a78758a4-c348-46e7-b580-ee3b72523137 req-c8581e44-68f8-4cd9-b735-a8e6842dfddd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received event network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:21 np0005466013 nova_compute[192144]: 2025-10-02 12:39:21.503 2 DEBUG oslo_concurrency.lockutils [req-a78758a4-c348-46e7-b580-ee3b72523137 req-c8581e44-68f8-4cd9-b735-a8e6842dfddd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:21 np0005466013 nova_compute[192144]: 2025-10-02 12:39:21.504 2 DEBUG oslo_concurrency.lockutils [req-a78758a4-c348-46e7-b580-ee3b72523137 req-c8581e44-68f8-4cd9-b735-a8e6842dfddd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:21 np0005466013 nova_compute[192144]: 2025-10-02 12:39:21.505 2 DEBUG oslo_concurrency.lockutils [req-a78758a4-c348-46e7-b580-ee3b72523137 req-c8581e44-68f8-4cd9-b735-a8e6842dfddd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:21 np0005466013 nova_compute[192144]: 2025-10-02 12:39:21.505 2 DEBUG nova.compute.manager [req-a78758a4-c348-46e7-b580-ee3b72523137 req-c8581e44-68f8-4cd9-b735-a8e6842dfddd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] No waiting events found dispatching network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:39:21 np0005466013 nova_compute[192144]: 2025-10-02 12:39:21.506 2 WARNING nova.compute.manager [req-a78758a4-c348-46e7-b580-ee3b72523137 req-c8581e44-68f8-4cd9-b735-a8e6842dfddd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received unexpected event network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb for instance with vm_state active and task_state None.#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.613 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.613 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.613 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.614 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.614 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.630 2 INFO nova.compute.manager [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Terminating instance#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.642 2 DEBUG nova.compute.manager [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:39:22 np0005466013 kernel: tapb2d256d9-67 (unregistering): left promiscuous mode
Oct  2 08:39:22 np0005466013 NetworkManager[51205]: <info>  [1759408762.6793] device (tapb2d256d9-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:39:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:22Z|00733|binding|INFO|Releasing lport b2d256d9-6788-41ed-a218-ab6139d999cb from this chassis (sb_readonly=0)
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:22Z|00734|binding|INFO|Setting lport b2d256d9-6788-41ed-a218-ab6139d999cb down in Southbound
Oct  2 08:39:22 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:22Z|00735|binding|INFO|Removing iface tapb2d256d9-67 ovn-installed in OVS
Oct  2 08:39:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:22.696 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:80:1f 10.100.0.13'], port_security=['fa:16:3e:02:80:1f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1976934795', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9c5a0e30-230d-427f-8ffd-2459f12ec78e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-670889c7-549b-45d0-be10-992f080979ef', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1976934795', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1aab0b39-6daf-41d1-a7da-b7bb077ff5e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.172', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54854aa2-539b-45ea-833b-3fc4d3ced3bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=b2d256d9-6788-41ed-a218-ab6139d999cb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:22.698 103323 INFO neutron.agent.ovn.metadata.agent [-] Port b2d256d9-6788-41ed-a218-ab6139d999cb in datapath 670889c7-549b-45d0-be10-992f080979ef unbound from our chassis#033[00m
Oct  2 08:39:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:22.699 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 670889c7-549b-45d0-be10-992f080979ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:39:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:22.700 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[80366235-d847-4aed-8d9d-20ba76545e52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:22 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:22.701 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-670889c7-549b-45d0-be10-992f080979ef namespace which is not needed anymore#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:22 np0005466013 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Oct  2 08:39:22 np0005466013 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a8.scope: Consumed 3.824s CPU time.
Oct  2 08:39:22 np0005466013 systemd-machined[152202]: Machine qemu-76-instance-000000a8 terminated.
Oct  2 08:39:22 np0005466013 neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef[248610]: [NOTICE]   (248614) : haproxy version is 2.8.14-c23fe91
Oct  2 08:39:22 np0005466013 neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef[248610]: [NOTICE]   (248614) : path to executable is /usr/sbin/haproxy
Oct  2 08:39:22 np0005466013 neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef[248610]: [WARNING]  (248614) : Exiting Master process...
Oct  2 08:39:22 np0005466013 neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef[248610]: [ALERT]    (248614) : Current worker (248616) exited with code 143 (Terminated)
Oct  2 08:39:22 np0005466013 neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef[248610]: [WARNING]  (248614) : All workers exited. Exiting... (0)
Oct  2 08:39:22 np0005466013 systemd[1]: libpod-1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422.scope: Deactivated successfully.
Oct  2 08:39:22 np0005466013 podman[248648]: 2025-10-02 12:39:22.878694154 +0000 UTC m=+0.103699440 container died 1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.905 2 INFO nova.virt.libvirt.driver [-] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Instance destroyed successfully.#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.906 2 DEBUG nova.objects.instance [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'resources' on Instance uuid 9c5a0e30-230d-427f-8ffd-2459f12ec78e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.920 2 DEBUG nova.virt.libvirt.vif [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-555592205',display_name='tempest-TestNetworkBasicOps-server-555592205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-555592205',id=168,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGbxc6q0Wnt6CUsjPK2DStON6HOjT3BcHvBERZfcZOJWtRfL5sAnOvqJCqKS5RgU8YP9fiJsOommoyAcbsnsX7YwYYXbA+WpeOp372gCPNe82JFzK/uLGwdQxKUr5+F/0g==',key_name='tempest-TestNetworkBasicOps-363464227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:39:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-5g0b3agz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:39:19Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=9c5a0e30-230d-427f-8ffd-2459f12ec78e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.920 2 DEBUG nova.network.os_vif_util [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "b2d256d9-6788-41ed-a218-ab6139d999cb", "address": "fa:16:3e:02:80:1f", "network": {"id": "670889c7-549b-45d0-be10-992f080979ef", "bridge": "br-int", "label": "tempest-network-smoke--1745189972", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d256d9-67", "ovs_interfaceid": "b2d256d9-6788-41ed-a218-ab6139d999cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.921 2 DEBUG nova.network.os_vif_util [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:80:1f,bridge_name='br-int',has_traffic_filtering=True,id=b2d256d9-6788-41ed-a218-ab6139d999cb,network=Network(670889c7-549b-45d0-be10-992f080979ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb2d256d9-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.922 2 DEBUG os_vif [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:80:1f,bridge_name='br-int',has_traffic_filtering=True,id=b2d256d9-6788-41ed-a218-ab6139d999cb,network=Network(670889c7-549b-45d0-be10-992f080979ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb2d256d9-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.924 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2d256d9-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:22 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422-userdata-shm.mount: Deactivated successfully.
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.930 2 INFO os_vif [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:80:1f,bridge_name='br-int',has_traffic_filtering=True,id=b2d256d9-6788-41ed-a218-ab6139d999cb,network=Network(670889c7-549b-45d0-be10-992f080979ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb2d256d9-67')#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.931 2 INFO nova.virt.libvirt.driver [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Deleting instance files /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e_del#033[00m
Oct  2 08:39:22 np0005466013 nova_compute[192144]: 2025-10-02 12:39:22.932 2 INFO nova.virt.libvirt.driver [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Deletion of /var/lib/nova/instances/9c5a0e30-230d-427f-8ffd-2459f12ec78e_del complete#033[00m
Oct  2 08:39:22 np0005466013 systemd[1]: var-lib-containers-storage-overlay-128a1bb4742fffd0cd863f9ebd7b46e24274ec6bb99396e73b0e01fbdf727a74-merged.mount: Deactivated successfully.
Oct  2 08:39:22 np0005466013 podman[248648]: 2025-10-02 12:39:22.946760575 +0000 UTC m=+0.171765861 container cleanup 1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:39:22 np0005466013 systemd[1]: libpod-conmon-1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422.scope: Deactivated successfully.
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.005 2 INFO nova.compute.manager [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.006 2 DEBUG oslo.service.loopingcall [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.006 2 DEBUG nova.compute.manager [-] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.006 2 DEBUG nova.network.neutron [-] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:39:23 np0005466013 podman[248696]: 2025-10-02 12:39:23.007969211 +0000 UTC m=+0.040972288 container remove 1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.012 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f1771c2e-2896-4413-b5b4-0d20843404e2]: (4, ('Thu Oct  2 12:39:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef (1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422)\n1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422\nThu Oct  2 12:39:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-670889c7-549b-45d0-be10-992f080979ef (1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422)\n1d573efa8f9ca041e8e7f045301c85dc707117c01cdb5a7544e869fe52fa9422\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.014 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8287dbdd-4009-40e6-b679-b4cf134cc528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.015 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap670889c7-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:23 np0005466013 kernel: tap670889c7-50: left promiscuous mode
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.021 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6f7830-c019-41ab-92f9-495235104bea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.062 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[798462f8-7aee-4278-8cf9-496106c959ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.063 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0b7468-1e24-4610-a362-771c6ee84f08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.080 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5b6105-ccaf-4ebc-9477-a23b95f71c92]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675040, 'reachable_time': 42252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248711, 'error': None, 'target': 'ovnmeta-670889c7-549b-45d0-be10-992f080979ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.082 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-670889c7-549b-45d0-be10-992f080979ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.082 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[06bbff15-c901-40a9-858e-af49bf9fe32f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:23 np0005466013 systemd[1]: run-netns-ovnmeta\x2d670889c7\x2d549b\x2d45d0\x2dbe10\x2d992f080979ef.mount: Deactivated successfully.
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.505 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:23.506 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.603 2 DEBUG nova.compute.manager [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received event network-vif-unplugged-b2d256d9-6788-41ed-a218-ab6139d999cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.603 2 DEBUG oslo_concurrency.lockutils [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.604 2 DEBUG oslo_concurrency.lockutils [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.604 2 DEBUG oslo_concurrency.lockutils [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.604 2 DEBUG nova.compute.manager [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] No waiting events found dispatching network-vif-unplugged-b2d256d9-6788-41ed-a218-ab6139d999cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.605 2 DEBUG nova.compute.manager [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received event network-vif-unplugged-b2d256d9-6788-41ed-a218-ab6139d999cb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.605 2 DEBUG nova.compute.manager [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received event network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.606 2 DEBUG oslo_concurrency.lockutils [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.606 2 DEBUG oslo_concurrency.lockutils [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.606 2 DEBUG oslo_concurrency.lockutils [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.607 2 DEBUG nova.compute.manager [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] No waiting events found dispatching network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:39:23 np0005466013 nova_compute[192144]: 2025-10-02 12:39:23.607 2 WARNING nova.compute.manager [req-eef3ff05-7845-48ec-ab8c-06a3037724c2 req-04301dce-58e5-480e-ac05-9dca318a6f4b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Received unexpected event network-vif-plugged-b2d256d9-6788-41ed-a218-ab6139d999cb for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.318 2 DEBUG nova.network.neutron [-] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.343 2 INFO nova.compute.manager [-] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Took 1.34 seconds to deallocate network for instance.#033[00m
Oct  2 08:39:24 np0005466013 podman[248713]: 2025-10-02 12:39:24.399821382 +0000 UTC m=+0.045553689 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:39:24 np0005466013 podman[248712]: 2025-10-02 12:39:24.416708949 +0000 UTC m=+0.061137566 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.421 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.422 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:24 np0005466013 podman[248714]: 2025-10-02 12:39:24.444498784 +0000 UTC m=+0.082970095 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.487 2 DEBUG nova.compute.provider_tree [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.502 2 DEBUG nova.scheduler.client.report [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.526 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.553 2 INFO nova.scheduler.client.report [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Deleted allocations for instance 9c5a0e30-230d-427f-8ffd-2459f12ec78e#033[00m
Oct  2 08:39:24 np0005466013 nova_compute[192144]: 2025-10-02 12:39:24.628 2 DEBUG oslo_concurrency.lockutils [None req-7ec4b217-ee85-471c-a3d1-a407f38ea0ab a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "9c5a0e30-230d-427f-8ffd-2459f12ec78e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:25 np0005466013 nova_compute[192144]: 2025-10-02 12:39:25.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:27.508 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:27 np0005466013 nova_compute[192144]: 2025-10-02 12:39:27.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:30 np0005466013 nova_compute[192144]: 2025-10-02 12:39:30.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:30 np0005466013 nova_compute[192144]: 2025-10-02 12:39:30.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:30 np0005466013 nova_compute[192144]: 2025-10-02 12:39:30.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:32 np0005466013 nova_compute[192144]: 2025-10-02 12:39:32.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:34 np0005466013 podman[248784]: 2025-10-02 12:39:34.681079117 +0000 UTC m=+0.055327844 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:39:34 np0005466013 podman[248785]: 2025-10-02 12:39:34.683738401 +0000 UTC m=+0.054916312 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:39:34 np0005466013 podman[248783]: 2025-10-02 12:39:34.688139247 +0000 UTC m=+0.062349023 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:39:35 np0005466013 nova_compute[192144]: 2025-10-02 12:39:35.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.657 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.657 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:36 np0005466013 podman[248840]: 2025-10-02 12:39:36.673591269 +0000 UTC m=+0.055361426 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.672 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:39:36 np0005466013 podman[248841]: 2025-10-02 12:39:36.705382329 +0000 UTC m=+0.071027533 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.796 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.796 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.804 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.804 2 INFO nova.compute.claims [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.928 2 DEBUG nova.compute.provider_tree [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.946 2 DEBUG nova.scheduler.client.report [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.970 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:36 np0005466013 nova_compute[192144]: 2025-10-02 12:39:36.971 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.032 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.033 2 DEBUG nova.network.neutron [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.060 2 INFO nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.086 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.269 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.271 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.271 2 INFO nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Creating image(s)#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.272 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.272 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.273 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.287 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.351 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.352 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.353 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.370 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.432 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.434 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.461 2 DEBUG nova.policy [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.672 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk 1073741824" returned: 0 in 0.238s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.673 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.674 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.766 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.768 2 DEBUG nova.virt.disk.api [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.769 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.840 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.841 2 DEBUG nova.virt.disk.api [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.842 2 DEBUG nova.objects.instance [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid cecfc8a5-ee75-46f1-8280-6869075855ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.905 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408762.9039598, 9c5a0e30-230d-427f-8ffd-2459f12ec78e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.906 2 INFO nova.compute.manager [-] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.994 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.995 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Ensure instance console log exists: /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.996 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.997 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:37 np0005466013 nova_compute[192144]: 2025-10-02 12:39:37.997 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:38 np0005466013 nova_compute[192144]: 2025-10-02 12:39:38.009 2 DEBUG nova.compute.manager [None req-ab1a04f3-82eb-40aa-87db-05dba337c6dd - - - - - -] [instance: 9c5a0e30-230d-427f-8ffd-2459f12ec78e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:39 np0005466013 nova_compute[192144]: 2025-10-02 12:39:39.965 2 DEBUG nova.network.neutron [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Successfully created port: 14809110-a5f0-4169-b4d1-b20b6b32320a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:39:40 np0005466013 nova_compute[192144]: 2025-10-02 12:39:40.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.004 2 DEBUG nova.network.neutron [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Successfully updated port: 14809110-a5f0-4169-b4d1-b20b6b32320a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.024 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.025 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.025 2 DEBUG nova.network.neutron [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.133 2 DEBUG nova.compute.manager [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-changed-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.134 2 DEBUG nova.compute.manager [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Refreshing instance network info cache due to event network-changed-14809110-a5f0-4169-b4d1-b20b6b32320a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.134 2 DEBUG oslo_concurrency.lockutils [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.185 2 DEBUG nova.network.neutron [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:39:41 np0005466013 nova_compute[192144]: 2025-10-02 12:39:41.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.017 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.018 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.018 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.018 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.171 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.172 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5687MB free_disk=73.13326263427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.172 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.172 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.271 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance cecfc8a5-ee75-46f1-8280-6869075855ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.271 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.272 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.352 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.376 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.410 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.411 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:42 np0005466013 nova_compute[192144]: 2025-10-02 12:39:42.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:43 np0005466013 nova_compute[192144]: 2025-10-02 12:39:43.410 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:43 np0005466013 nova_compute[192144]: 2025-10-02 12:39:43.410 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.541 2 DEBUG nova.network.neutron [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updating instance_info_cache with network_info: [{"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.564 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.565 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Instance network_info: |[{"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.565 2 DEBUG oslo_concurrency.lockutils [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.566 2 DEBUG nova.network.neutron [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Refreshing network info cache for port 14809110-a5f0-4169-b4d1-b20b6b32320a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.569 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Start _get_guest_xml network_info=[{"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.574 2 WARNING nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.582 2 DEBUG nova.virt.libvirt.host [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.583 2 DEBUG nova.virt.libvirt.host [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.587 2 DEBUG nova.virt.libvirt.host [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.589 2 DEBUG nova.virt.libvirt.host [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.591 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.592 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.592 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.593 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.593 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.594 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.594 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.595 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.595 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.595 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.596 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.596 2 DEBUG nova.virt.hardware [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.603 2 DEBUG nova.virt.libvirt.vif [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:39:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-640150742',display_name='tempest-TestGettingAddress-server-640150742',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-640150742',id=169,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIRceitbRKs6XXbf7kadsFogQdqZxrZp17i/4hhUcf2GwYzZeOwXhP4JbD0KEwVqLcpeLnMlwZ2D2mCuHBJR8KfWga2sLnw7wFx6+c4GjZ3JqHI31K0krXjHmfSh85q/Fw==',key_name='tempest-TestGettingAddress-568346474',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-95v55i56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:37Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cecfc8a5-ee75-46f1-8280-6869075855ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.604 2 DEBUG nova.network.os_vif_util [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.606 2 DEBUG nova.network.os_vif_util [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:ca:b0,bridge_name='br-int',has_traffic_filtering=True,id=14809110-a5f0-4169-b4d1-b20b6b32320a,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14809110-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.607 2 DEBUG nova.objects.instance [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid cecfc8a5-ee75-46f1-8280-6869075855ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.654 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <uuid>cecfc8a5-ee75-46f1-8280-6869075855ac</uuid>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <name>instance-000000a9</name>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestGettingAddress-server-640150742</nova:name>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:39:44</nova:creationTime>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        <nova:port uuid="14809110-a5f0-4169-b4d1-b20b6b32320a">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe53:cab0" ipVersion="6"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe53:cab0" ipVersion="6"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <entry name="serial">cecfc8a5-ee75-46f1-8280-6869075855ac</entry>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <entry name="uuid">cecfc8a5-ee75-46f1-8280-6869075855ac</entry>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk.config"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:53:ca:b0"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <target dev="tap14809110-a5"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/console.log" append="off"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:39:44 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:39:44 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:39:44 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:39:44 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.656 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Preparing to wait for external event network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.656 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.657 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.657 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.657 2 DEBUG nova.virt.libvirt.vif [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:39:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-640150742',display_name='tempest-TestGettingAddress-server-640150742',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-640150742',id=169,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIRceitbRKs6XXbf7kadsFogQdqZxrZp17i/4hhUcf2GwYzZeOwXhP4JbD0KEwVqLcpeLnMlwZ2D2mCuHBJR8KfWga2sLnw7wFx6+c4GjZ3JqHI31K0krXjHmfSh85q/Fw==',key_name='tempest-TestGettingAddress-568346474',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-95v55i56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:37Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cecfc8a5-ee75-46f1-8280-6869075855ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.658 2 DEBUG nova.network.os_vif_util [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.658 2 DEBUG nova.network.os_vif_util [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:ca:b0,bridge_name='br-int',has_traffic_filtering=True,id=14809110-a5f0-4169-b4d1-b20b6b32320a,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14809110-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.659 2 DEBUG os_vif [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:ca:b0,bridge_name='br-int',has_traffic_filtering=True,id=14809110-a5f0-4169-b4d1-b20b6b32320a,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14809110-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.659 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.660 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.663 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14809110-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.663 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap14809110-a5, col_values=(('external_ids', {'iface-id': '14809110-a5f0-4169-b4d1-b20b6b32320a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:ca:b0', 'vm-uuid': 'cecfc8a5-ee75-46f1-8280-6869075855ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:44 np0005466013 NetworkManager[51205]: <info>  [1759408784.6673] manager: (tap14809110-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.678 2 INFO os_vif [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:ca:b0,bridge_name='br-int',has_traffic_filtering=True,id=14809110-a5f0-4169-b4d1-b20b6b32320a,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14809110-a5')#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.737 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.738 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.738 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:53:ca:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:39:44 np0005466013 nova_compute[192144]: 2025-10-02 12:39:44.738 2 INFO nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Using config drive#033[00m
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.371 2 INFO nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Creating config drive at /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk.config#033[00m
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.376 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkunehib execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.507 2 DEBUG oslo_concurrency.processutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkunehib" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:45 np0005466013 kernel: tap14809110-a5: entered promiscuous mode
Oct  2 08:39:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:45Z|00736|binding|INFO|Claiming lport 14809110-a5f0-4169-b4d1-b20b6b32320a for this chassis.
Oct  2 08:39:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:45Z|00737|binding|INFO|14809110-a5f0-4169-b4d1-b20b6b32320a: Claiming fa:16:3e:53:ca:b0 10.100.0.13 2001:db8:0:1:f816:3eff:fe53:cab0 2001:db8::f816:3eff:fe53:cab0
Oct  2 08:39:45 np0005466013 NetworkManager[51205]: <info>  [1759408785.5960] manager: (tap14809110-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:45 np0005466013 NetworkManager[51205]: <info>  [1759408785.6114] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Oct  2 08:39:45 np0005466013 NetworkManager[51205]: <info>  [1759408785.6131] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Oct  2 08:39:45 np0005466013 systemd-udevd[248917]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.628 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:ca:b0 10.100.0.13 2001:db8:0:1:f816:3eff:fe53:cab0 2001:db8::f816:3eff:fe53:cab0'], port_security=['fa:16:3e:53:ca:b0 10.100.0.13 2001:db8:0:1:f816:3eff:fe53:cab0 2001:db8::f816:3eff:fe53:cab0'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe53:cab0/64 2001:db8::f816:3eff:fe53:cab0/64', 'neutron:device_id': 'cecfc8a5-ee75-46f1-8280-6869075855ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '28c32917-1b1e-49b6-8086-1dbf8a5f4a7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=14809110-a5f0-4169-b4d1-b20b6b32320a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.630 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 14809110-a5f0-4169-b4d1-b20b6b32320a in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 bound to our chassis#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.631 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177#033[00m
Oct  2 08:39:45 np0005466013 NetworkManager[51205]: <info>  [1759408785.6396] device (tap14809110-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:39:45 np0005466013 NetworkManager[51205]: <info>  [1759408785.6410] device (tap14809110-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:39:45 np0005466013 systemd-machined[152202]: New machine qemu-77-instance-000000a9.
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.647 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[218567ce-c3a5-4cb3-b6f2-f5d638be8170]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.648 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb9d6d69e-01 in ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.651 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb9d6d69e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.652 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4d79ee-33e7-4714-9fdb-6dada3d0911e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.652 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d211bb72-1406-471a-bb7f-4d4d764b157d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.674 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[c8420fa1-278d-4f0a-ba85-7ad79d4184f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 systemd[1]: Started Virtual Machine qemu-77-instance-000000a9.
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.712 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[309c1710-741a-423f-8161-a9de164aab86]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:45Z|00738|binding|INFO|Setting lport 14809110-a5f0-4169-b4d1-b20b6b32320a ovn-installed in OVS
Oct  2 08:39:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:45Z|00739|binding|INFO|Setting lport 14809110-a5f0-4169-b4d1-b20b6b32320a up in Southbound
Oct  2 08:39:45 np0005466013 nova_compute[192144]: 2025-10-02 12:39:45.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.761 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[87da4af3-ee0a-45f0-9c2c-69e924bf967d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.767 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[50c2491d-e52a-4bc1-bf58-cfdc90636b1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 NetworkManager[51205]: <info>  [1759408785.7694] manager: (tapb9d6d69e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.812 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[4ffc7a89-ac0d-4af0-a9f6-5339c5586997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.820 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[87226134-8546-41b2-9c8a-38e6eef95b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 NetworkManager[51205]: <info>  [1759408785.8443] device (tapb9d6d69e-00): carrier: link connected
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.850 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e378de24-91b1-4ab1-83a8-4f2640e347f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.871 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4c92cacc-42c0-424f-b98a-f2c4a52396ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9d6d69e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:a1:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677746, 'reachable_time': 28251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248951, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.894 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[59483c01-145c-4246-b8a7-54de5e886140]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:a121'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677746, 'tstamp': 677746}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248952, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.916 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[27039dac-a0b1-41ea-a2f4-62b6bc5e7717]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9d6d69e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:a1:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677746, 'reachable_time': 28251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248953, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:45 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:45.960 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f42e08f6-4a44-474b-9d8b-ffefcef35a9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.037 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[324e2f2d-016c-4e9a-8fd9-c6f47f64060c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.040 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9d6d69e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.040 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.041 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9d6d69e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:46 np0005466013 kernel: tapb9d6d69e-00: entered promiscuous mode
Oct  2 08:39:46 np0005466013 nova_compute[192144]: 2025-10-02 12:39:46.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:46 np0005466013 NetworkManager[51205]: <info>  [1759408786.0469] manager: (tapb9d6d69e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Oct  2 08:39:46 np0005466013 nova_compute[192144]: 2025-10-02 12:39:46.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.048 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb9d6d69e-00, col_values=(('external_ids', {'iface-id': 'ad07d234-3bc8-429a-8834-7a9ae3274be2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:46 np0005466013 nova_compute[192144]: 2025-10-02 12:39:46.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:46 np0005466013 nova_compute[192144]: 2025-10-02 12:39:46.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:46 np0005466013 ovn_controller[94366]: 2025-10-02T12:39:46Z|00740|binding|INFO|Releasing lport ad07d234-3bc8-429a-8834-7a9ae3274be2 from this chassis (sb_readonly=0)
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.051 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.052 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[49d7bf1d-078b-46e8-adab-9d8d233448f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.053 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.pid.haproxy
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:39:46 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:39:46.053 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'env', 'PROCESS_TAG=haproxy-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:39:46 np0005466013 nova_compute[192144]: 2025-10-02 12:39:46.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:46 np0005466013 podman[248985]: 2025-10-02 12:39:46.46188782 +0000 UTC m=+0.052105404 container create c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:39:46 np0005466013 systemd[1]: Started libpod-conmon-c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9.scope.
Oct  2 08:39:46 np0005466013 podman[248985]: 2025-10-02 12:39:46.4333253 +0000 UTC m=+0.023542804 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:39:46 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:39:46 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed049222bbf747bea54b089e5e736ca4ba054e6a20351e46cd5cc74ad1163ffe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:39:46 np0005466013 podman[248985]: 2025-10-02 12:39:46.562544275 +0000 UTC m=+0.152761839 container init c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:39:46 np0005466013 podman[248985]: 2025-10-02 12:39:46.574127567 +0000 UTC m=+0.164345091 container start c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:39:46 np0005466013 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[249001]: [NOTICE]   (249005) : New worker (249007) forked
Oct  2 08:39:46 np0005466013 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[249001]: [NOTICE]   (249005) : Loading success.
Oct  2 08:39:47 np0005466013 nova_compute[192144]: 2025-10-02 12:39:47.144 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408787.1429644, cecfc8a5-ee75-46f1-8280-6869075855ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:47 np0005466013 nova_compute[192144]: 2025-10-02 12:39:47.144 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] VM Started (Lifecycle Event)#033[00m
Oct  2 08:39:47 np0005466013 nova_compute[192144]: 2025-10-02 12:39:47.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.666 2 DEBUG nova.compute.manager [req-6efee65f-2b8b-46fe-bde2-1d2b2d58247c req-270d5f06-aec4-49a3-97dc-390ed38c6f76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.666 2 DEBUG oslo_concurrency.lockutils [req-6efee65f-2b8b-46fe-bde2-1d2b2d58247c req-270d5f06-aec4-49a3-97dc-390ed38c6f76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.666 2 DEBUG oslo_concurrency.lockutils [req-6efee65f-2b8b-46fe-bde2-1d2b2d58247c req-270d5f06-aec4-49a3-97dc-390ed38c6f76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.666 2 DEBUG oslo_concurrency.lockutils [req-6efee65f-2b8b-46fe-bde2-1d2b2d58247c req-270d5f06-aec4-49a3-97dc-390ed38c6f76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.667 2 DEBUG nova.compute.manager [req-6efee65f-2b8b-46fe-bde2-1d2b2d58247c req-270d5f06-aec4-49a3-97dc-390ed38c6f76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Processing event network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.667 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.673 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.678 2 INFO nova.virt.libvirt.driver [-] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Instance spawned successfully.#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.679 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.688 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.693 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.721 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.722 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.722 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.723 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.723 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.724 2 DEBUG nova.virt.libvirt.driver [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.729 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.729 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408787.1437278, cecfc8a5-ee75-46f1-8280-6869075855ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.729 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.760 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.764 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408788.672344, cecfc8a5-ee75-46f1-8280-6869075855ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.764 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.791 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.794 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.819 2 INFO nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Took 11.55 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.819 2 DEBUG nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.821 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.912 2 INFO nova.compute.manager [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Took 12.16 seconds to build instance.#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.944 2 DEBUG oslo_concurrency.lockutils [None req-145bc34b-6e3a-4598-9150-556c5b450874 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:48 np0005466013 nova_compute[192144]: 2025-10-02 12:39:48.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:49 np0005466013 nova_compute[192144]: 2025-10-02 12:39:49.516 2 DEBUG nova.network.neutron [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updated VIF entry in instance network info cache for port 14809110-a5f0-4169-b4d1-b20b6b32320a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:39:49 np0005466013 nova_compute[192144]: 2025-10-02 12:39:49.517 2 DEBUG nova.network.neutron [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updating instance_info_cache with network_info: [{"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:49 np0005466013 nova_compute[192144]: 2025-10-02 12:39:49.554 2 DEBUG oslo_concurrency.lockutils [req-18ef040a-7559-42f3-b30c-fa5ba8313740 req-c4b100c2-4185-4428-84bd-e3d6e6b9f943 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:49 np0005466013 nova_compute[192144]: 2025-10-02 12:39:49.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:49 np0005466013 nova_compute[192144]: 2025-10-02 12:39:49.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:49 np0005466013 nova_compute[192144]: 2025-10-02 12:39:49.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:39:50 np0005466013 nova_compute[192144]: 2025-10-02 12:39:50.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:50 np0005466013 nova_compute[192144]: 2025-10-02 12:39:50.795 2 DEBUG nova.compute.manager [req-3da7cf86-6f5f-4d0a-acc2-cc9ad68dbf85 req-203a0c81-ff48-4e27-9c8f-cf559143911e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:50 np0005466013 nova_compute[192144]: 2025-10-02 12:39:50.796 2 DEBUG oslo_concurrency.lockutils [req-3da7cf86-6f5f-4d0a-acc2-cc9ad68dbf85 req-203a0c81-ff48-4e27-9c8f-cf559143911e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:50 np0005466013 nova_compute[192144]: 2025-10-02 12:39:50.796 2 DEBUG oslo_concurrency.lockutils [req-3da7cf86-6f5f-4d0a-acc2-cc9ad68dbf85 req-203a0c81-ff48-4e27-9c8f-cf559143911e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:50 np0005466013 nova_compute[192144]: 2025-10-02 12:39:50.796 2 DEBUG oslo_concurrency.lockutils [req-3da7cf86-6f5f-4d0a-acc2-cc9ad68dbf85 req-203a0c81-ff48-4e27-9c8f-cf559143911e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:50 np0005466013 nova_compute[192144]: 2025-10-02 12:39:50.796 2 DEBUG nova.compute.manager [req-3da7cf86-6f5f-4d0a-acc2-cc9ad68dbf85 req-203a0c81-ff48-4e27-9c8f-cf559143911e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] No waiting events found dispatching network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:39:50 np0005466013 nova_compute[192144]: 2025-10-02 12:39:50.796 2 WARNING nova.compute.manager [req-3da7cf86-6f5f-4d0a-acc2-cc9ad68dbf85 req-203a0c81-ff48-4e27-9c8f-cf559143911e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received unexpected event network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:39:52 np0005466013 nova_compute[192144]: 2025-10-02 12:39:52.020 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:54 np0005466013 nova_compute[192144]: 2025-10-02 12:39:54.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:54 np0005466013 podman[249024]: 2025-10-02 12:39:54.694117671 +0000 UTC m=+0.060196235 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:39:54 np0005466013 podman[249023]: 2025-10-02 12:39:54.699811239 +0000 UTC m=+0.067705341 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:39:54 np0005466013 podman[249025]: 2025-10-02 12:39:54.73518178 +0000 UTC m=+0.098733426 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:39:54 np0005466013 nova_compute[192144]: 2025-10-02 12:39:54.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:54 np0005466013 nova_compute[192144]: 2025-10-02 12:39:54.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:39:54 np0005466013 nova_compute[192144]: 2025-10-02 12:39:54.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:39:55 np0005466013 nova_compute[192144]: 2025-10-02 12:39:55.419 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:55 np0005466013 nova_compute[192144]: 2025-10-02 12:39:55.420 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:55 np0005466013 nova_compute[192144]: 2025-10-02 12:39:55.420 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:39:55 np0005466013 nova_compute[192144]: 2025-10-02 12:39:55.420 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid cecfc8a5-ee75-46f1-8280-6869075855ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:55 np0005466013 nova_compute[192144]: 2025-10-02 12:39:55.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:56 np0005466013 nova_compute[192144]: 2025-10-02 12:39:56.557 2 DEBUG nova.compute.manager [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-changed-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:56 np0005466013 nova_compute[192144]: 2025-10-02 12:39:56.558 2 DEBUG nova.compute.manager [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Refreshing instance network info cache due to event network-changed-14809110-a5f0-4169-b4d1-b20b6b32320a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:39:56 np0005466013 nova_compute[192144]: 2025-10-02 12:39:56.558 2 DEBUG oslo_concurrency.lockutils [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:59 np0005466013 nova_compute[192144]: 2025-10-02 12:39:59.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.452 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updating instance_info_cache with network_info: [{"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.565 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.566 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.568 2 DEBUG oslo_concurrency.lockutils [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.568 2 DEBUG nova.network.neutron [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Refreshing network info cache for port 14809110-a5f0-4169-b4d1-b20b6b32320a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.570 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.572 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.572 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:40:00 np0005466013 nova_compute[192144]: 2025-10-02 12:40:00.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:40:00Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:ca:b0 10.100.0.13
Oct  2 08:40:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:40:00Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:ca:b0 10.100.0.13
Oct  2 08:40:01 np0005466013 nova_compute[192144]: 2025-10-02 12:40:01.949 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:40:01 np0005466013 nova_compute[192144]: 2025-10-02 12:40:01.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:02.325 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:02.326 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:02.327 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:04 np0005466013 nova_compute[192144]: 2025-10-02 12:40:04.256 2 DEBUG nova.network.neutron [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updated VIF entry in instance network info cache for port 14809110-a5f0-4169-b4d1-b20b6b32320a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:40:04 np0005466013 nova_compute[192144]: 2025-10-02 12:40:04.257 2 DEBUG nova.network.neutron [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updating instance_info_cache with network_info: [{"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:40:04 np0005466013 nova_compute[192144]: 2025-10-02 12:40:04.333 2 DEBUG oslo_concurrency.lockutils [req-54512dba-ea7b-4651-9f96-8eb49e3e80c8 req-6278d953-8ae8-4d54-bc3c-f9332185efbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:40:04 np0005466013 nova_compute[192144]: 2025-10-02 12:40:04.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:05.573 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:05.574 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:40:05 np0005466013 nova_compute[192144]: 2025-10-02 12:40:05.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:05 np0005466013 podman[249098]: 2025-10-02 12:40:05.702284577 +0000 UTC m=+0.065959025 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:40:05 np0005466013 podman[249100]: 2025-10-02 12:40:05.712126484 +0000 UTC m=+0.064873332 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:40:05 np0005466013 podman[249099]: 2025-10-02 12:40:05.725727587 +0000 UTC m=+0.073643034 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Oct  2 08:40:05 np0005466013 nova_compute[192144]: 2025-10-02 12:40:05.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:06 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:06.576 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:40:07 np0005466013 podman[249155]: 2025-10-02 12:40:07.715417911 +0000 UTC m=+0.077186185 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:40:07 np0005466013 podman[249154]: 2025-10-02 12:40:07.733527106 +0000 UTC m=+0.101306147 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:40:09 np0005466013 nova_compute[192144]: 2025-10-02 12:40:09.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:10 np0005466013 nova_compute[192144]: 2025-10-02 12:40:10.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:14 np0005466013 nova_compute[192144]: 2025-10-02 12:40:14.589 2 DEBUG nova.compute.manager [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-changed-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:14 np0005466013 nova_compute[192144]: 2025-10-02 12:40:14.590 2 DEBUG nova.compute.manager [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Refreshing instance network info cache due to event network-changed-14809110-a5f0-4169-b4d1-b20b6b32320a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:40:14 np0005466013 nova_compute[192144]: 2025-10-02 12:40:14.591 2 DEBUG oslo_concurrency.lockutils [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:40:14 np0005466013 nova_compute[192144]: 2025-10-02 12:40:14.591 2 DEBUG oslo_concurrency.lockutils [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:40:14 np0005466013 nova_compute[192144]: 2025-10-02 12:40:14.591 2 DEBUG nova.network.neutron [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Refreshing network info cache for port 14809110-a5f0-4169-b4d1-b20b6b32320a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:40:14 np0005466013 nova_compute[192144]: 2025-10-02 12:40:14.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.249 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.251 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.251 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.252 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.252 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.403 2 INFO nova.compute.manager [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Terminating instance#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.501 2 DEBUG nova.compute.manager [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:40:15 np0005466013 kernel: tap14809110-a5 (unregistering): left promiscuous mode
Oct  2 08:40:15 np0005466013 NetworkManager[51205]: <info>  [1759408815.5259] device (tap14809110-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:40:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:40:15Z|00741|binding|INFO|Releasing lport 14809110-a5f0-4169-b4d1-b20b6b32320a from this chassis (sb_readonly=0)
Oct  2 08:40:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:40:15Z|00742|binding|INFO|Setting lport 14809110-a5f0-4169-b4d1-b20b6b32320a down in Southbound
Oct  2 08:40:15 np0005466013 ovn_controller[94366]: 2025-10-02T12:40:15Z|00743|binding|INFO|Removing iface tap14809110-a5 ovn-installed in OVS
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:15.580 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:ca:b0 10.100.0.13 2001:db8:0:1:f816:3eff:fe53:cab0 2001:db8::f816:3eff:fe53:cab0'], port_security=['fa:16:3e:53:ca:b0 10.100.0.13 2001:db8:0:1:f816:3eff:fe53:cab0 2001:db8::f816:3eff:fe53:cab0'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe53:cab0/64 2001:db8::f816:3eff:fe53:cab0/64', 'neutron:device_id': 'cecfc8a5-ee75-46f1-8280-6869075855ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '28c32917-1b1e-49b6-8086-1dbf8a5f4a7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=14809110-a5f0-4169-b4d1-b20b6b32320a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:15.583 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 14809110-a5f0-4169-b4d1-b20b6b32320a in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 unbound from our chassis#033[00m
Oct  2 08:40:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:15.586 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:40:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:15.587 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[268986d0-70f4-4250-8502-3ac8db60f9e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:15 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:15.588 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 namespace which is not needed anymore#033[00m
Oct  2 08:40:15 np0005466013 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Oct  2 08:40:15 np0005466013 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a9.scope: Consumed 14.347s CPU time.
Oct  2 08:40:15 np0005466013 systemd-machined[152202]: Machine qemu-77-instance-000000a9 terminated.
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.776 2 INFO nova.virt.libvirt.driver [-] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Instance destroyed successfully.#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.777 2 DEBUG nova.objects.instance [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid cecfc8a5-ee75-46f1-8280-6869075855ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.803 2 DEBUG nova.virt.libvirt.vif [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:39:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-640150742',display_name='tempest-TestGettingAddress-server-640150742',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-640150742',id=169,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIRceitbRKs6XXbf7kadsFogQdqZxrZp17i/4hhUcf2GwYzZeOwXhP4JbD0KEwVqLcpeLnMlwZ2D2mCuHBJR8KfWga2sLnw7wFx6+c4GjZ3JqHI31K0krXjHmfSh85q/Fw==',key_name='tempest-TestGettingAddress-568346474',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:39:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-95v55i56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:39:48Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=cecfc8a5-ee75-46f1-8280-6869075855ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.804 2 DEBUG nova.network.os_vif_util [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.805 2 DEBUG nova.network.os_vif_util [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:ca:b0,bridge_name='br-int',has_traffic_filtering=True,id=14809110-a5f0-4169-b4d1-b20b6b32320a,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14809110-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.805 2 DEBUG os_vif [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:ca:b0,bridge_name='br-int',has_traffic_filtering=True,id=14809110-a5f0-4169-b4d1-b20b6b32320a,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14809110-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.807 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14809110-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.813 2 INFO os_vif [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:ca:b0,bridge_name='br-int',has_traffic_filtering=True,id=14809110-a5f0-4169-b4d1-b20b6b32320a,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14809110-a5')#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.814 2 INFO nova.virt.libvirt.driver [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Deleting instance files /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac_del#033[00m
Oct  2 08:40:15 np0005466013 nova_compute[192144]: 2025-10-02 12:40:15.815 2 INFO nova.virt.libvirt.driver [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Deletion of /var/lib/nova/instances/cecfc8a5-ee75-46f1-8280-6869075855ac_del complete#033[00m
Oct  2 08:40:15 np0005466013 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[249001]: [NOTICE]   (249005) : haproxy version is 2.8.14-c23fe91
Oct  2 08:40:15 np0005466013 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[249001]: [NOTICE]   (249005) : path to executable is /usr/sbin/haproxy
Oct  2 08:40:15 np0005466013 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[249001]: [WARNING]  (249005) : Exiting Master process...
Oct  2 08:40:15 np0005466013 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[249001]: [ALERT]    (249005) : Current worker (249007) exited with code 143 (Terminated)
Oct  2 08:40:15 np0005466013 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[249001]: [WARNING]  (249005) : All workers exited. Exiting... (0)
Oct  2 08:40:15 np0005466013 systemd[1]: libpod-c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9.scope: Deactivated successfully.
Oct  2 08:40:15 np0005466013 podman[249222]: 2025-10-02 12:40:15.898154962 +0000 UTC m=+0.156954439 container died c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:40:16 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9-userdata-shm.mount: Deactivated successfully.
Oct  2 08:40:16 np0005466013 systemd[1]: var-lib-containers-storage-overlay-ed049222bbf747bea54b089e5e736ca4ba054e6a20351e46cd5cc74ad1163ffe-merged.mount: Deactivated successfully.
Oct  2 08:40:16 np0005466013 podman[249222]: 2025-10-02 12:40:16.104047976 +0000 UTC m=+0.362847443 container cleanup c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:40:16 np0005466013 systemd[1]: libpod-conmon-c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9.scope: Deactivated successfully.
Oct  2 08:40:16 np0005466013 podman[249270]: 2025-10-02 12:40:16.337044823 +0000 UTC m=+0.197475522 container remove c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.345 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fc7cded4-8fef-417b-a56e-a124b498da58]: (4, ('Thu Oct  2 12:40:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 (c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9)\nc0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9\nThu Oct  2 12:40:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 (c0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9)\nc0d6db00e9954f901f6c377101d6d4c1be5f472574345f974d584e22e3deb0f9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.349 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0632c09b-32e3-413b-8d0e-4c4b97aa02e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.350 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9d6d69e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:40:16 np0005466013 nova_compute[192144]: 2025-10-02 12:40:16.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:16 np0005466013 kernel: tapb9d6d69e-00: left promiscuous mode
Oct  2 08:40:16 np0005466013 nova_compute[192144]: 2025-10-02 12:40:16.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.373 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7e68fce1-8cd6-4891-8199-5a12f24f86e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.411 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bad1fe00-625b-48bd-9a93-cd987ad3a0ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.413 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ec9950e3-fbbf-48b3-aaf4-b69dc71d556b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.433 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb82532-d6d9-42d7-adb1-55fc097ad459]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677737, 'reachable_time': 19564, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249285, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.436 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:40:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:16.436 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[704f5493-f614-4d68-a9a1-3a36e8b55494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:16 np0005466013 systemd[1]: run-netns-ovnmeta\x2db9d6d69e\x2d0327\x2d4bcf\x2db8a6\x2db2cf69a4d177.mount: Deactivated successfully.
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.849 2 DEBUG nova.compute.manager [req-36c255fa-1afb-4c6b-a331-0e8f2c976f9a req-43b3116a-1179-4efc-abf6-72ff59e3ca76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-vif-unplugged-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.850 2 DEBUG oslo_concurrency.lockutils [req-36c255fa-1afb-4c6b-a331-0e8f2c976f9a req-43b3116a-1179-4efc-abf6-72ff59e3ca76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.850 2 DEBUG oslo_concurrency.lockutils [req-36c255fa-1afb-4c6b-a331-0e8f2c976f9a req-43b3116a-1179-4efc-abf6-72ff59e3ca76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.851 2 DEBUG oslo_concurrency.lockutils [req-36c255fa-1afb-4c6b-a331-0e8f2c976f9a req-43b3116a-1179-4efc-abf6-72ff59e3ca76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.851 2 DEBUG nova.compute.manager [req-36c255fa-1afb-4c6b-a331-0e8f2c976f9a req-43b3116a-1179-4efc-abf6-72ff59e3ca76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] No waiting events found dispatching network-vif-unplugged-14809110-a5f0-4169-b4d1-b20b6b32320a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.851 2 DEBUG nova.compute.manager [req-36c255fa-1afb-4c6b-a331-0e8f2c976f9a req-43b3116a-1179-4efc-abf6-72ff59e3ca76 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-vif-unplugged-14809110-a5f0-4169-b4d1-b20b6b32320a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.893 2 INFO nova.compute.manager [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Took 2.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.893 2 DEBUG oslo.service.loopingcall [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.894 2 DEBUG nova.compute.manager [-] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:40:17 np0005466013 nova_compute[192144]: 2025-10-02 12:40:17.894 2 DEBUG nova.network.neutron [-] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:40:19 np0005466013 nova_compute[192144]: 2025-10-02 12:40:19.700 2 DEBUG nova.network.neutron [-] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:40:19 np0005466013 nova_compute[192144]: 2025-10-02 12:40:19.728 2 INFO nova.compute.manager [-] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Took 1.83 seconds to deallocate network for instance.#033[00m
Oct  2 08:40:19 np0005466013 nova_compute[192144]: 2025-10-02 12:40:19.849 2 DEBUG nova.compute.manager [req-d5cd0a67-e84d-4d8e-bfe7-f60e39e8e369 req-6e6baf27-fcfc-4f1b-a97f-557e78ba0b09 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-vif-deleted-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:19 np0005466013 nova_compute[192144]: 2025-10-02 12:40:19.856 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:19 np0005466013 nova_compute[192144]: 2025-10-02 12:40:19.857 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:19 np0005466013 nova_compute[192144]: 2025-10-02 12:40:19.985 2 DEBUG nova.compute.provider_tree [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:19.999 2 DEBUG nova.scheduler.client.report [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.006 2 DEBUG nova.compute.manager [req-e4d555d3-1f47-429f-bd0b-032ad7922671 req-46a81b8f-0f91-476f-9e15-5f6a4aa981e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received event network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.007 2 DEBUG oslo_concurrency.lockutils [req-e4d555d3-1f47-429f-bd0b-032ad7922671 req-46a81b8f-0f91-476f-9e15-5f6a4aa981e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.007 2 DEBUG oslo_concurrency.lockutils [req-e4d555d3-1f47-429f-bd0b-032ad7922671 req-46a81b8f-0f91-476f-9e15-5f6a4aa981e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.008 2 DEBUG oslo_concurrency.lockutils [req-e4d555d3-1f47-429f-bd0b-032ad7922671 req-46a81b8f-0f91-476f-9e15-5f6a4aa981e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.008 2 DEBUG nova.compute.manager [req-e4d555d3-1f47-429f-bd0b-032ad7922671 req-46a81b8f-0f91-476f-9e15-5f6a4aa981e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] No waiting events found dispatching network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.008 2 WARNING nova.compute.manager [req-e4d555d3-1f47-429f-bd0b-032ad7922671 req-46a81b8f-0f91-476f-9e15-5f6a4aa981e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Received unexpected event network-vif-plugged-14809110-a5f0-4169-b4d1-b20b6b32320a for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.036 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.087 2 INFO nova.scheduler.client.report [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance cecfc8a5-ee75-46f1-8280-6869075855ac#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.208 2 DEBUG oslo_concurrency.lockutils [None req-de42ca87-487c-4eb4-9dac-c8f6a845d587 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "cecfc8a5-ee75-46f1-8280-6869075855ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.957s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:20 np0005466013 nova_compute[192144]: 2025-10-02 12:40:20.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:21 np0005466013 nova_compute[192144]: 2025-10-02 12:40:21.236 2 DEBUG nova.network.neutron [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updated VIF entry in instance network info cache for port 14809110-a5f0-4169-b4d1-b20b6b32320a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:40:21 np0005466013 nova_compute[192144]: 2025-10-02 12:40:21.237 2 DEBUG nova.network.neutron [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Updating instance_info_cache with network_info: [{"id": "14809110-a5f0-4169-b4d1-b20b6b32320a", "address": "fa:16:3e:53:ca:b0", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe53:cab0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14809110-a5", "ovs_interfaceid": "14809110-a5f0-4169-b4d1-b20b6b32320a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:40:21 np0005466013 nova_compute[192144]: 2025-10-02 12:40:21.907 2 DEBUG oslo_concurrency.lockutils [req-f7c73c41-4138-4229-a55c-2832dc83c031 req-83fd3c72-d29d-4f79-8af4-2ff0b8e345db 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cecfc8a5-ee75-46f1-8280-6869075855ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:40:25 np0005466013 podman[249287]: 2025-10-02 12:40:25.705787736 +0000 UTC m=+0.073267453 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:40:25 np0005466013 podman[249286]: 2025-10-02 12:40:25.73157147 +0000 UTC m=+0.103656460 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:40:25 np0005466013 podman[249288]: 2025-10-02 12:40:25.73157214 +0000 UTC m=+0.099158080 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:40:25 np0005466013 nova_compute[192144]: 2025-10-02 12:40:25.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:25 np0005466013 nova_compute[192144]: 2025-10-02 12:40:25.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:30 np0005466013 nova_compute[192144]: 2025-10-02 12:40:30.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:30 np0005466013 nova_compute[192144]: 2025-10-02 12:40:30.774 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408815.7734506, cecfc8a5-ee75-46f1-8280-6869075855ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:40:30 np0005466013 nova_compute[192144]: 2025-10-02 12:40:30.774 2 INFO nova.compute.manager [-] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:40:30 np0005466013 nova_compute[192144]: 2025-10-02 12:40:30.813 2 DEBUG nova.compute.manager [None req-41087ddd-213c-4b30-a303-5e48230eb68b - - - - - -] [instance: cecfc8a5-ee75-46f1-8280-6869075855ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:40:30 np0005466013 nova_compute[192144]: 2025-10-02 12:40:30.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:35 np0005466013 nova_compute[192144]: 2025-10-02 12:40:35.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:35 np0005466013 nova_compute[192144]: 2025-10-02 12:40:35.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:36 np0005466013 podman[249354]: 2025-10-02 12:40:36.709564096 +0000 UTC m=+0.064635634 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:40:36 np0005466013 podman[249353]: 2025-10-02 12:40:36.715814391 +0000 UTC m=+0.075488763 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal)
Oct  2 08:40:36 np0005466013 podman[249352]: 2025-10-02 12:40:36.733082518 +0000 UTC m=+0.090695946 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  2 08:40:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:38.405 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8::f816:3eff:fe2b:a121'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad07d234-3bc8-429a-8834-7a9ae3274be2) old=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8:0:1:f816:3eff:fe2b:a121 2001:db8::f816:3eff:fe2b:a121'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe2b:a121/64 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:38.408 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad07d234-3bc8-429a-8834-7a9ae3274be2 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 updated#033[00m
Oct  2 08:40:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:38.411 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:40:38 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:38.412 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d68c2609-92da-4e22-9161-b1ce7144cd8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:38 np0005466013 podman[249412]: 2025-10-02 12:40:38.678322287 +0000 UTC m=+0.056143160 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:40:38 np0005466013 podman[249413]: 2025-10-02 12:40:38.704815102 +0000 UTC m=+0.068184545 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:40:40 np0005466013 nova_compute[192144]: 2025-10-02 12:40:40.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:40 np0005466013 nova_compute[192144]: 2025-10-02 12:40:40.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:40 np0005466013 nova_compute[192144]: 2025-10-02 12:40:40.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:40 np0005466013 nova_compute[192144]: 2025-10-02 12:40:40.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.133 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.134 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.134 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.134 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.157 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.157 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.157 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.157 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.290 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.292 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5713MB free_disk=73.13349533081055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.292 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.292 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.740 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.740 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.853 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.964 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.964 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:40:43 np0005466013 nova_compute[192144]: 2025-10-02 12:40:43.981 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:40:44 np0005466013 nova_compute[192144]: 2025-10-02 12:40:44.008 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:40:44 np0005466013 nova_compute[192144]: 2025-10-02 12:40:44.027 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:40:44 np0005466013 nova_compute[192144]: 2025-10-02 12:40:44.044 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:40:44 np0005466013 nova_compute[192144]: 2025-10-02 12:40:44.079 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:40:44 np0005466013 nova_compute[192144]: 2025-10-02 12:40:44.080 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:45 np0005466013 nova_compute[192144]: 2025-10-02 12:40:45.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:45 np0005466013 nova_compute[192144]: 2025-10-02 12:40:45.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:45 np0005466013 nova_compute[192144]: 2025-10-02 12:40:45.940 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:48 np0005466013 nova_compute[192144]: 2025-10-02 12:40:48.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:50 np0005466013 nova_compute[192144]: 2025-10-02 12:40:50.010 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:50 np0005466013 nova_compute[192144]: 2025-10-02 12:40:50.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:50 np0005466013 nova_compute[192144]: 2025-10-02 12:40:50.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:50 np0005466013 nova_compute[192144]: 2025-10-02 12:40:50.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:52 np0005466013 nova_compute[192144]: 2025-10-02 12:40:52.007 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:55 np0005466013 nova_compute[192144]: 2025-10-02 12:40:55.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:55 np0005466013 nova_compute[192144]: 2025-10-02 12:40:55.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:55 np0005466013 podman[249457]: 2025-10-02 12:40:55.870084087 +0000 UTC m=+0.046116677 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:40:55 np0005466013 podman[249458]: 2025-10-02 12:40:55.887744797 +0000 UTC m=+0.057209003 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent)
Oct  2 08:40:55 np0005466013 podman[249459]: 2025-10-02 12:40:55.906657776 +0000 UTC m=+0.076861445 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:40:55 np0005466013 nova_compute[192144]: 2025-10-02 12:40:55.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:55 np0005466013 nova_compute[192144]: 2025-10-02 12:40:55.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:40:55 np0005466013 nova_compute[192144]: 2025-10-02 12:40:55.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:40:56 np0005466013 nova_compute[192144]: 2025-10-02 12:40:56.012 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:40:56 np0005466013 nova_compute[192144]: 2025-10-02 12:40:56.013 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:59.122 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:9b:5a 10.100.0.2 2001:db8::f816:3eff:fec1:9b5a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec1:9b5a/64', 'neutron:device_id': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d90df5bc-8770-4be5-937c-0abfe33bbe11, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f69cd95a-5b20-4a47-8acc-7e190d1dac4c) old=Port_Binding(mac=['fa:16:3e:c1:9b:5a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:59.124 103323 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f69cd95a-5b20-4a47-8acc-7e190d1dac4c in datapath c4f50473-7465-4325-8b4d-bb57fca0162f updated#033[00m
Oct  2 08:40:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:59.125 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c4f50473-7465-4325-8b4d-bb57fca0162f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:40:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:40:59.126 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ae75cf3f-997b-436b-a2d1-0091e9bed4b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:00 np0005466013 nova_compute[192144]: 2025-10-02 12:41:00.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:00 np0005466013 nova_compute[192144]: 2025-10-02 12:41:00.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:02.325 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:02.326 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:02.326 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:05 np0005466013 nova_compute[192144]: 2025-10-02 12:41:05.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:05 np0005466013 nova_compute[192144]: 2025-10-02 12:41:05.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:05 np0005466013 nova_compute[192144]: 2025-10-02 12:41:05.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  2 08:41:05 np0005466013 nova_compute[192144]: 2025-10-02 12:41:05.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:41:05 np0005466013 nova_compute[192144]: 2025-10-02 12:41:05.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:05 np0005466013 nova_compute[192144]: 2025-10-02 12:41:05.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:41:07 np0005466013 podman[249523]: 2025-10-02 12:41:07.735329098 +0000 UTC m=+0.097630922 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:41:07 np0005466013 podman[249525]: 2025-10-02 12:41:07.736071201 +0000 UTC m=+0.085837384 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:41:07 np0005466013 podman[249524]: 2025-10-02 12:41:07.765996774 +0000 UTC m=+0.125001995 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct  2 08:41:09 np0005466013 podman[249581]: 2025-10-02 12:41:09.715338931 +0000 UTC m=+0.079422685 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:41:09 np0005466013 podman[249582]: 2025-10-02 12:41:09.732051692 +0000 UTC m=+0.100338107 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:41:10 np0005466013 nova_compute[192144]: 2025-10-02 12:41:10.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.029 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.030 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.054 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.206 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.207 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.217 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.217 2 INFO nova.compute.claims [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.572 2 DEBUG nova.compute.provider_tree [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.604 2 DEBUG nova.scheduler.client.report [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.643 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.644 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.732 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.732 2 DEBUG nova.network.neutron [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.764 2 INFO nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.792 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.981 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.983 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.984 2 INFO nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Creating image(s)#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.985 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.986 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:12 np0005466013 nova_compute[192144]: 2025-10-02 12:41:12.987 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.013 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.108 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.110 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.111 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.135 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.207 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.209 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.253 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.254 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.255 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.325 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.327 2 DEBUG nova.virt.disk.api [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.328 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.393 2 DEBUG nova.policy [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.399 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.400 2 DEBUG nova.virt.disk.api [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.400 2 DEBUG nova.objects.instance [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b393965-8a4a-44e3-a983-22523112e307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.426 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.427 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Ensure instance console log exists: /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.427 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.428 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:13 np0005466013 nova_compute[192144]: 2025-10-02 12:41:13.428 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:15 np0005466013 nova_compute[192144]: 2025-10-02 12:41:15.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:41:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:17 np0005466013 nova_compute[192144]: 2025-10-02 12:41:17.386 2 DEBUG nova.network.neutron [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Successfully created port: d526d728-0b99-484c-ba01-139df2c989a1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.073 2 DEBUG nova.network.neutron [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Successfully updated port: d526d728-0b99-484c-ba01-139df2c989a1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.089 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.090 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.090 2 DEBUG nova.network.neutron [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.208 2 DEBUG nova.compute.manager [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-changed-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.209 2 DEBUG nova.compute.manager [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Refreshing instance network info cache due to event network-changed-d526d728-0b99-484c-ba01-139df2c989a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.209 2 DEBUG oslo_concurrency.lockutils [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.397 2 DEBUG nova.network.neutron [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:41:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:19.427 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:41:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:19.428 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:41:19 np0005466013 nova_compute[192144]: 2025-10-02 12:41:19.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:19 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:19.430 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:20 np0005466013 nova_compute[192144]: 2025-10-02 12:41:20.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:21 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:21Z|00744|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.621 2 DEBUG nova.network.neutron [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updating instance_info_cache with network_info: [{"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.638 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.638 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Instance network_info: |[{"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.639 2 DEBUG oslo_concurrency.lockutils [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.639 2 DEBUG nova.network.neutron [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Refreshing network info cache for port d526d728-0b99-484c-ba01-139df2c989a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.642 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Start _get_guest_xml network_info=[{"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.648 2 WARNING nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.655 2 DEBUG nova.virt.libvirt.host [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.656 2 DEBUG nova.virt.libvirt.host [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.660 2 DEBUG nova.virt.libvirt.host [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.661 2 DEBUG nova.virt.libvirt.host [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.662 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.662 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.663 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.663 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.663 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.663 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.663 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.664 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.664 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.664 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.664 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.664 2 DEBUG nova.virt.hardware [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.668 2 DEBUG nova.virt.libvirt.vif [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-702332065',display_name='tempest-TestGettingAddress-server-702332065',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-702332065',id=172,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA+W0VRkiRweukUMP9gTiBmsYPV9brmjtI29ZBDpBBAygrYooWG25zCR9mTx41solx25wcFgiDOyxlNfpE1uIsSJmxlJws14lF4ZLmAF9ATw8SJfg5U3tBL7QCF1+HVJ7Q==',key_name='tempest-TestGettingAddress-972559981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-mta0kn8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:12Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=2b393965-8a4a-44e3-a983-22523112e307,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.668 2 DEBUG nova.network.os_vif_util [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.670 2 DEBUG nova.network.os_vif_util [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:83:09,bridge_name='br-int',has_traffic_filtering=True,id=d526d728-0b99-484c-ba01-139df2c989a1,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd526d728-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.670 2 DEBUG nova.objects.instance [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b393965-8a4a-44e3-a983-22523112e307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.694 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <uuid>2b393965-8a4a-44e3-a983-22523112e307</uuid>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <name>instance-000000ac</name>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestGettingAddress-server-702332065</nova:name>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:41:22</nova:creationTime>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        <nova:port uuid="d526d728-0b99-484c-ba01-139df2c989a1">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe8f:8309" ipVersion="6"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <entry name="serial">2b393965-8a4a-44e3-a983-22523112e307</entry>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <entry name="uuid">2b393965-8a4a-44e3-a983-22523112e307</entry>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk.config"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:8f:83:09"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <target dev="tapd526d728-0b"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/console.log" append="off"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:41:22 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:41:22 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:41:22 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:41:22 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.696 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Preparing to wait for external event network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.696 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.697 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.697 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.698 2 DEBUG nova.virt.libvirt.vif [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-702332065',display_name='tempest-TestGettingAddress-server-702332065',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-702332065',id=172,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA+W0VRkiRweukUMP9gTiBmsYPV9brmjtI29ZBDpBBAygrYooWG25zCR9mTx41solx25wcFgiDOyxlNfpE1uIsSJmxlJws14lF4ZLmAF9ATw8SJfg5U3tBL7QCF1+HVJ7Q==',key_name='tempest-TestGettingAddress-972559981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-mta0kn8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:12Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=2b393965-8a4a-44e3-a983-22523112e307,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.698 2 DEBUG nova.network.os_vif_util [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.698 2 DEBUG nova.network.os_vif_util [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:83:09,bridge_name='br-int',has_traffic_filtering=True,id=d526d728-0b99-484c-ba01-139df2c989a1,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd526d728-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.699 2 DEBUG os_vif [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:83:09,bridge_name='br-int',has_traffic_filtering=True,id=d526d728-0b99-484c-ba01-139df2c989a1,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd526d728-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.700 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.700 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.703 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd526d728-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.703 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd526d728-0b, col_values=(('external_ids', {'iface-id': 'd526d728-0b99-484c-ba01-139df2c989a1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:83:09', 'vm-uuid': '2b393965-8a4a-44e3-a983-22523112e307'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:22 np0005466013 NetworkManager[51205]: <info>  [1759408882.7070] manager: (tapd526d728-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.713 2 INFO os_vif [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:83:09,bridge_name='br-int',has_traffic_filtering=True,id=d526d728-0b99-484c-ba01-139df2c989a1,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd526d728-0b')#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.867 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.868 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.869 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:8f:83:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:41:22 np0005466013 nova_compute[192144]: 2025-10-02 12:41:22.870 2 INFO nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Using config drive#033[00m
Oct  2 08:41:23 np0005466013 nova_compute[192144]: 2025-10-02 12:41:23.606 2 INFO nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Creating config drive at /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk.config#033[00m
Oct  2 08:41:23 np0005466013 nova_compute[192144]: 2025-10-02 12:41:23.615 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpekoht0io execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:23 np0005466013 nova_compute[192144]: 2025-10-02 12:41:23.762 2 DEBUG oslo_concurrency.processutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpekoht0io" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:23 np0005466013 kernel: tapd526d728-0b: entered promiscuous mode
Oct  2 08:41:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:23Z|00745|binding|INFO|Claiming lport d526d728-0b99-484c-ba01-139df2c989a1 for this chassis.
Oct  2 08:41:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:23Z|00746|binding|INFO|d526d728-0b99-484c-ba01-139df2c989a1: Claiming fa:16:3e:8f:83:09 10.100.0.9 2001:db8::f816:3eff:fe8f:8309
Oct  2 08:41:23 np0005466013 NetworkManager[51205]: <info>  [1759408883.8379] manager: (tapd526d728-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/330)
Oct  2 08:41:23 np0005466013 nova_compute[192144]: 2025-10-02 12:41:23.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005466013 nova_compute[192144]: 2025-10-02 12:41:23.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.860 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:83:09 10.100.0.9 2001:db8::f816:3eff:fe8f:8309'], port_security=['fa:16:3e:8f:83:09 10.100.0.9 2001:db8::f816:3eff:fe8f:8309'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe8f:8309/64', 'neutron:device_id': '2b393965-8a4a-44e3-a983-22523112e307', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '629ae092-c18f-4a62-b981-2da6f9b93771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d90df5bc-8770-4be5-937c-0abfe33bbe11, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d526d728-0b99-484c-ba01-139df2c989a1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.861 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d526d728-0b99-484c-ba01-139df2c989a1 in datapath c4f50473-7465-4325-8b4d-bb57fca0162f bound to our chassis#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.862 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c4f50473-7465-4325-8b4d-bb57fca0162f#033[00m
Oct  2 08:41:23 np0005466013 systemd-udevd[249660]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.878 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[18eb4806-4a75-4e5d-81a4-ee0dcef2512b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.879 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc4f50473-71 in ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:41:23 np0005466013 NetworkManager[51205]: <info>  [1759408883.8839] device (tapd526d728-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:41:23 np0005466013 NetworkManager[51205]: <info>  [1759408883.8856] device (tapd526d728-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.885 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc4f50473-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.885 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca3a3a7-b7df-4d63-a080-38bc0e0358b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.886 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[51f2d0c9-3aeb-4b84-90b6-bfad563fff4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:23 np0005466013 systemd-machined[152202]: New machine qemu-78-instance-000000ac.
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.898 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c7d7ef-31b9-4f05-b718-9f8aba9dc6af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:23 np0005466013 systemd[1]: Started Virtual Machine qemu-78-instance-000000ac.
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.929 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7d349916-6e06-46d2-8e46-a9ba9b56ee49]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:23Z|00747|binding|INFO|Setting lport d526d728-0b99-484c-ba01-139df2c989a1 ovn-installed in OVS
Oct  2 08:41:23 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:23Z|00748|binding|INFO|Setting lport d526d728-0b99-484c-ba01-139df2c989a1 up in Southbound
Oct  2 08:41:23 np0005466013 nova_compute[192144]: 2025-10-02 12:41:23.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.962 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e7823c-f90d-4ee1-978d-50cfaf3608f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:23 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:23.966 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[415f54e5-4594-4caa-9a17-2a7a9fab3771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:23 np0005466013 NetworkManager[51205]: <info>  [1759408883.9688] manager: (tapc4f50473-70): new Veth device (/org/freedesktop/NetworkManager/Devices/331)
Oct  2 08:41:23 np0005466013 systemd-udevd[249664]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.003 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7431d6e9-e6da-4163-9780-72683f0d1db6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.007 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[26f3f1a2-813f-4f0d-ba37-fad97c276d67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 NetworkManager[51205]: <info>  [1759408884.0273] device (tapc4f50473-70): carrier: link connected
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.031 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[b7763836-c3f0-4cb7-b117-083face59f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.045 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e4af5386-41ae-46fa-a931-775e759d1e43]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc4f50473-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:9b:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687564, 'reachable_time': 15412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249694, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.057 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f18c7aea-9850-454d-b8d3-1f438f531497]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec1:9b5a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687564, 'tstamp': 687564}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249695, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.070 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[bbffb958-d4a4-49da-ba6b-b67cacd834d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc4f50473-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:9b:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687564, 'reachable_time': 15412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249696, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.102 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[10d6b7c2-573e-4c96-9e91-622b65265d5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.166 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cac55121-6778-4682-9df0-670d316d36ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.167 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4f50473-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.168 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.168 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4f50473-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:24 np0005466013 NetworkManager[51205]: <info>  [1759408884.1706] manager: (tapc4f50473-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Oct  2 08:41:24 np0005466013 kernel: tapc4f50473-70: entered promiscuous mode
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.176 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc4f50473-70, col_values=(('external_ids', {'iface-id': 'f69cd95a-5b20-4a47-8acc-7e190d1dac4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:24 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:24Z|00749|binding|INFO|Releasing lport f69cd95a-5b20-4a47-8acc-7e190d1dac4c from this chassis (sb_readonly=0)
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.179 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c4f50473-7465-4325-8b4d-bb57fca0162f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c4f50473-7465-4325-8b4d-bb57fca0162f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.180 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[0eac05f9-9368-4094-8f2a-b96f547be103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.181 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-c4f50473-7465-4325-8b4d-bb57fca0162f
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/c4f50473-7465-4325-8b4d-bb57fca0162f.pid.haproxy
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID c4f50473-7465-4325-8b4d-bb57fca0162f
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:41:24 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:41:24.181 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'env', 'PROCESS_TAG=haproxy-c4f50473-7465-4325-8b4d-bb57fca0162f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c4f50473-7465-4325-8b4d-bb57fca0162f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:24 np0005466013 podman[249735]: 2025-10-02 12:41:24.589347189 +0000 UTC m=+0.068069541 container create 66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:41:24 np0005466013 podman[249735]: 2025-10-02 12:41:24.547496465 +0000 UTC m=+0.026218867 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:41:24 np0005466013 systemd[1]: Started libpod-conmon-66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9.scope.
Oct  2 08:41:24 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:41:24 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/413eb68d11cdca05d60dadd398d797cbcd8079ccc2d1f5fba01adb8ca4688cf7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:41:24 np0005466013 podman[249735]: 2025-10-02 12:41:24.705120614 +0000 UTC m=+0.183843006 container init 66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:41:24 np0005466013 podman[249735]: 2025-10-02 12:41:24.710190353 +0000 UTC m=+0.188912695 container start 66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.726 2 DEBUG nova.compute.manager [req-b7a2cf62-9e5b-40d0-aeb2-cb51714c6e66 req-9dc86c2a-ec8c-4376-b4f6-67340c5d3400 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.727 2 DEBUG oslo_concurrency.lockutils [req-b7a2cf62-9e5b-40d0-aeb2-cb51714c6e66 req-9dc86c2a-ec8c-4376-b4f6-67340c5d3400 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.728 2 DEBUG oslo_concurrency.lockutils [req-b7a2cf62-9e5b-40d0-aeb2-cb51714c6e66 req-9dc86c2a-ec8c-4376-b4f6-67340c5d3400 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.729 2 DEBUG oslo_concurrency.lockutils [req-b7a2cf62-9e5b-40d0-aeb2-cb51714c6e66 req-9dc86c2a-ec8c-4376-b4f6-67340c5d3400 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.729 2 DEBUG nova.compute.manager [req-b7a2cf62-9e5b-40d0-aeb2-cb51714c6e66 req-9dc86c2a-ec8c-4376-b4f6-67340c5d3400 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Processing event network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:41:24 np0005466013 neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f[249750]: [NOTICE]   (249754) : New worker (249756) forked
Oct  2 08:41:24 np0005466013 neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f[249750]: [NOTICE]   (249754) : Loading success.
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.835 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.837 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408884.8353145, 2b393965-8a4a-44e3-a983-22523112e307 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.838 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] VM Started (Lifecycle Event)#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.841 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.845 2 INFO nova.virt.libvirt.driver [-] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Instance spawned successfully.#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.845 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.870 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.876 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.880 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.881 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.881 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.882 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.882 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.883 2 DEBUG nova.virt.libvirt.driver [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.920 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.921 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408884.8381023, 2b393965-8a4a-44e3-a983-22523112e307 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.921 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.950 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.954 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408884.841052, 2b393965-8a4a-44e3-a983-22523112e307 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.955 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.975 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.979 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.984 2 INFO nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Took 12.00 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.985 2 DEBUG nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:24 np0005466013 nova_compute[192144]: 2025-10-02 12:41:24.997 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:25 np0005466013 nova_compute[192144]: 2025-10-02 12:41:25.089 2 INFO nova.compute.manager [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Took 12.96 seconds to build instance.#033[00m
Oct  2 08:41:25 np0005466013 nova_compute[192144]: 2025-10-02 12:41:25.125 2 DEBUG oslo_concurrency.lockutils [None req-055d2ce0-187d-40c5-bcc1-1757cf51fd36 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:25 np0005466013 nova_compute[192144]: 2025-10-02 12:41:25.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:26 np0005466013 podman[249765]: 2025-10-02 12:41:26.679561833 +0000 UTC m=+0.051851025 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:41:26 np0005466013 podman[249766]: 2025-10-02 12:41:26.685390985 +0000 UTC m=+0.054201849 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:41:26 np0005466013 podman[249767]: 2025-10-02 12:41:26.797355622 +0000 UTC m=+0.154329697 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.886 2 DEBUG nova.compute.manager [req-413bb6c0-799a-4375-908e-7ec82669da3b req-f0207f03-c768-4429-a026-04e0069a1c16 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.886 2 DEBUG oslo_concurrency.lockutils [req-413bb6c0-799a-4375-908e-7ec82669da3b req-f0207f03-c768-4429-a026-04e0069a1c16 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.887 2 DEBUG oslo_concurrency.lockutils [req-413bb6c0-799a-4375-908e-7ec82669da3b req-f0207f03-c768-4429-a026-04e0069a1c16 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.887 2 DEBUG oslo_concurrency.lockutils [req-413bb6c0-799a-4375-908e-7ec82669da3b req-f0207f03-c768-4429-a026-04e0069a1c16 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.888 2 DEBUG nova.compute.manager [req-413bb6c0-799a-4375-908e-7ec82669da3b req-f0207f03-c768-4429-a026-04e0069a1c16 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] No waiting events found dispatching network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.888 2 WARNING nova.compute.manager [req-413bb6c0-799a-4375-908e-7ec82669da3b req-f0207f03-c768-4429-a026-04e0069a1c16 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received unexpected event network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.902 2 DEBUG nova.network.neutron [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updated VIF entry in instance network info cache for port d526d728-0b99-484c-ba01-139df2c989a1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.903 2 DEBUG nova.network.neutron [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updating instance_info_cache with network_info: [{"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:26 np0005466013 nova_compute[192144]: 2025-10-02 12:41:26.924 2 DEBUG oslo_concurrency.lockutils [req-3c868777-5bb6-44ee-96f0-36b27b15b88a req-dfe82693-ba65-4ba2-9897-09e771b58114 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:27 np0005466013 nova_compute[192144]: 2025-10-02 12:41:27.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:29 np0005466013 NetworkManager[51205]: <info>  [1759408889.4852] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Oct  2 08:41:29 np0005466013 NetworkManager[51205]: <info>  [1759408889.4875] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:29 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:29Z|00750|binding|INFO|Releasing lport f69cd95a-5b20-4a47-8acc-7e190d1dac4c from this chassis (sb_readonly=0)
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.932 2 DEBUG nova.compute.manager [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-changed-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.932 2 DEBUG nova.compute.manager [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Refreshing instance network info cache due to event network-changed-d526d728-0b99-484c-ba01-139df2c989a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.933 2 DEBUG oslo_concurrency.lockutils [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.933 2 DEBUG oslo_concurrency.lockutils [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:29 np0005466013 nova_compute[192144]: 2025-10-02 12:41:29.934 2 DEBUG nova.network.neutron [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Refreshing network info cache for port d526d728-0b99-484c-ba01-139df2c989a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:30 np0005466013 nova_compute[192144]: 2025-10-02 12:41:30.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:32 np0005466013 nova_compute[192144]: 2025-10-02 12:41:32.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:32 np0005466013 nova_compute[192144]: 2025-10-02 12:41:32.993 2 DEBUG nova.network.neutron [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updated VIF entry in instance network info cache for port d526d728-0b99-484c-ba01-139df2c989a1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:32 np0005466013 nova_compute[192144]: 2025-10-02 12:41:32.994 2 DEBUG nova.network.neutron [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updating instance_info_cache with network_info: [{"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:33 np0005466013 nova_compute[192144]: 2025-10-02 12:41:33.015 2 DEBUG oslo_concurrency.lockutils [req-bdd32642-d6d1-4d34-acc9-142df2f6078d req-66c703a4-60c1-4a3e-bd17-a3bfc83ecd26 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:35 np0005466013 nova_compute[192144]: 2025-10-02 12:41:35.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:37 np0005466013 nova_compute[192144]: 2025-10-02 12:41:37.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:38Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:83:09 10.100.0.9
Oct  2 08:41:38 np0005466013 ovn_controller[94366]: 2025-10-02T12:41:38Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:83:09 10.100.0.9
Oct  2 08:41:38 np0005466013 podman[249851]: 2025-10-02 12:41:38.712802686 +0000 UTC m=+0.081608740 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:41:38 np0005466013 podman[249852]: 2025-10-02 12:41:38.714565452 +0000 UTC m=+0.073429845 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:41:38 np0005466013 podman[249853]: 2025-10-02 12:41:38.742354664 +0000 UTC m=+0.102152646 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:41:40 np0005466013 podman[249911]: 2025-10-02 12:41:40.720770682 +0000 UTC m=+0.081890509 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:41:40 np0005466013 podman[249912]: 2025-10-02 12:41:40.7321818 +0000 UTC m=+0.087973440 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:41:40 np0005466013 nova_compute[192144]: 2025-10-02 12:41:40.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:42 np0005466013 nova_compute[192144]: 2025-10-02 12:41:42.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:42 np0005466013 nova_compute[192144]: 2025-10-02 12:41:42.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:42 np0005466013 nova_compute[192144]: 2025-10-02 12:41:42.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:41:44 np0005466013 nova_compute[192144]: 2025-10-02 12:41:44.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:44 np0005466013 nova_compute[192144]: 2025-10-02 12:41:44.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:44 np0005466013 nova_compute[192144]: 2025-10-02 12:41:44.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.029 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.029 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.122 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.202 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.203 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.258 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.407 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.408 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5501MB free_disk=73.10440444946289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.409 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.409 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.510 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance 2b393965-8a4a-44e3-a983-22523112e307 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.511 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.511 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.560 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.584 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.612 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.612 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:45 np0005466013 nova_compute[192144]: 2025-10-02 12:41:45.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:47 np0005466013 nova_compute[192144]: 2025-10-02 12:41:47.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:50 np0005466013 nova_compute[192144]: 2025-10-02 12:41:50.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:51 np0005466013 nova_compute[192144]: 2025-10-02 12:41:51.760 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:51 np0005466013 nova_compute[192144]: 2025-10-02 12:41:51.760 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:51 np0005466013 nova_compute[192144]: 2025-10-02 12:41:51.785 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:41:51 np0005466013 nova_compute[192144]: 2025-10-02 12:41:51.912 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:51 np0005466013 nova_compute[192144]: 2025-10-02 12:41:51.913 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:51 np0005466013 nova_compute[192144]: 2025-10-02 12:41:51.921 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:41:51 np0005466013 nova_compute[192144]: 2025-10-02 12:41:51.922 2 INFO nova.compute.claims [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.088 2 DEBUG nova.compute.provider_tree [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.116 2 DEBUG nova.scheduler.client.report [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.202 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.203 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.286 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.286 2 DEBUG nova.network.neutron [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.311 2 INFO nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.349 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.509 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.511 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.511 2 INFO nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Creating image(s)#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.512 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.512 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.513 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.528 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.595 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.597 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.597 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.614 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.637 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.638 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.684 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.686 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.756 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.758 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.759 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.853 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.855 2 DEBUG nova.virt.disk.api [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.856 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.915 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.917 2 DEBUG nova.virt.disk.api [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.918 2 DEBUG nova.objects.instance [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 772fac62-a076-4ec7-b14f-4b869824cceb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.938 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.939 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Ensure instance console log exists: /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.940 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.941 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:41:52 np0005466013 nova_compute[192144]: 2025-10-02 12:41:52.941 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:41:53 np0005466013 nova_compute[192144]: 2025-10-02 12:41:53.376 2 DEBUG nova.policy [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:41:53 np0005466013 nova_compute[192144]: 2025-10-02 12:41:53.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:41:55 np0005466013 nova_compute[192144]: 2025-10-02 12:41:55.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:41:56 np0005466013 nova_compute[192144]: 2025-10-02 12:41:56.656 2 DEBUG nova.network.neutron [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Successfully created port: 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:41:57 np0005466013 podman[249983]: 2025-10-02 12:41:57.684896598 +0000 UTC m=+0.062892214 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:41:57 np0005466013 podman[249982]: 2025-10-02 12:41:57.685700853 +0000 UTC m=+0.063539554 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:41:57 np0005466013 podman[249984]: 2025-10-02 12:41:57.712662619 +0000 UTC m=+0.081303471 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:41:57 np0005466013 nova_compute[192144]: 2025-10-02 12:41:57.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:41:57 np0005466013 nova_compute[192144]: 2025-10-02 12:41:57.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:41:57 np0005466013 nova_compute[192144]: 2025-10-02 12:41:57.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:41:57 np0005466013 nova_compute[192144]: 2025-10-02 12:41:57.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.014 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.081 2 DEBUG nova.network.neutron [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Successfully updated port: 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.097 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.097 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.097 2 DEBUG nova.network.neutron [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.196 2 DEBUG nova.compute.manager [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-changed-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.196 2 DEBUG nova.compute.manager [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Refreshing instance network info cache due to event network-changed-80f550e3-804c-4ec1-86ab-fcd1b44d48b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.196 2 DEBUG oslo_concurrency.lockutils [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.203 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.203 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.203 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.203 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2b393965-8a4a-44e3-a983-22523112e307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:41:58 np0005466013 nova_compute[192144]: 2025-10-02 12:41:58.252 2 DEBUG nova.network.neutron [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:41:59 np0005466013 nova_compute[192144]: 2025-10-02 12:41:59.982 2 DEBUG nova.network.neutron [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updating instance_info_cache with network_info: [{"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.005 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.006 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Instance network_info: |[{"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.006 2 DEBUG oslo_concurrency.lockutils [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.006 2 DEBUG nova.network.neutron [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Refreshing network info cache for port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.010 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Start _get_guest_xml network_info=[{"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.014 2 WARNING nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.019 2 DEBUG nova.virt.libvirt.host [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.019 2 DEBUG nova.virt.libvirt.host [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.022 2 DEBUG nova.virt.libvirt.host [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.022 2 DEBUG nova.virt.libvirt.host [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.023 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.023 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.024 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.024 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.024 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.025 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.025 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.025 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.025 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.025 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.026 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.026 2 DEBUG nova.virt.hardware [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.030 2 DEBUG nova.virt.libvirt.vif [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1028993127',display_name='tempest-TestGettingAddress-server-1028993127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1028993127',id=174,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA+W0VRkiRweukUMP9gTiBmsYPV9brmjtI29ZBDpBBAygrYooWG25zCR9mTx41solx25wcFgiDOyxlNfpE1uIsSJmxlJws14lF4ZLmAF9ATw8SJfg5U3tBL7QCF1+HVJ7Q==',key_name='tempest-TestGettingAddress-972559981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ie1iqzyk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:52Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=772fac62-a076-4ec7-b14f-4b869824cceb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.031 2 DEBUG nova.network.os_vif_util [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.031 2 DEBUG nova.network.os_vif_util [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c9:26,bridge_name='br-int',has_traffic_filtering=True,id=80f550e3-804c-4ec1-86ab-fcd1b44d48b5,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80f550e3-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.032 2 DEBUG nova.objects.instance [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 772fac62-a076-4ec7-b14f-4b869824cceb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.047 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <uuid>772fac62-a076-4ec7-b14f-4b869824cceb</uuid>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <name>instance-000000ae</name>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestGettingAddress-server-1028993127</nova:name>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:42:00</nova:creationTime>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        <nova:port uuid="80f550e3-804c-4ec1-86ab-fcd1b44d48b5">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe4e:c926" ipVersion="6"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <entry name="serial">772fac62-a076-4ec7-b14f-4b869824cceb</entry>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <entry name="uuid">772fac62-a076-4ec7-b14f-4b869824cceb</entry>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk.config"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:4e:c9:26"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <target dev="tap80f550e3-80"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/console.log" append="off"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:42:00 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:42:00 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:42:00 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:42:00 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.048 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Preparing to wait for external event network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.048 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.048 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.049 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.049 2 DEBUG nova.virt.libvirt.vif [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1028993127',display_name='tempest-TestGettingAddress-server-1028993127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1028993127',id=174,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA+W0VRkiRweukUMP9gTiBmsYPV9brmjtI29ZBDpBBAygrYooWG25zCR9mTx41solx25wcFgiDOyxlNfpE1uIsSJmxlJws14lF4ZLmAF9ATw8SJfg5U3tBL7QCF1+HVJ7Q==',key_name='tempest-TestGettingAddress-972559981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ie1iqzyk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:52Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=772fac62-a076-4ec7-b14f-4b869824cceb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.050 2 DEBUG nova.network.os_vif_util [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.050 2 DEBUG nova.network.os_vif_util [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c9:26,bridge_name='br-int',has_traffic_filtering=True,id=80f550e3-804c-4ec1-86ab-fcd1b44d48b5,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80f550e3-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.051 2 DEBUG os_vif [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c9:26,bridge_name='br-int',has_traffic_filtering=True,id=80f550e3-804c-4ec1-86ab-fcd1b44d48b5,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80f550e3-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.052 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.052 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80f550e3-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap80f550e3-80, col_values=(('external_ids', {'iface-id': '80f550e3-804c-4ec1-86ab-fcd1b44d48b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:c9:26', 'vm-uuid': '772fac62-a076-4ec7-b14f-4b869824cceb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 NetworkManager[51205]: <info>  [1759408920.0594] manager: (tap80f550e3-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.065 2 INFO os_vif [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c9:26,bridge_name='br-int',has_traffic_filtering=True,id=80f550e3-804c-4ec1-86ab-fcd1b44d48b5,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80f550e3-80')#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.183 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.184 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.184 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:4e:c9:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.184 2 INFO nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Using config drive#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.248 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updating instance_info_cache with network_info: [{"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.285 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.286 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.286 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.581 2 INFO nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Creating config drive at /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk.config#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.589 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo6h3rdnv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.719 2 DEBUG oslo_concurrency.processutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo6h3rdnv" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:42:00 np0005466013 kernel: tap80f550e3-80: entered promiscuous mode
Oct  2 08:42:00 np0005466013 NetworkManager[51205]: <info>  [1759408920.7924] manager: (tap80f550e3-80): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Oct  2 08:42:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:00Z|00751|binding|INFO|Claiming lport 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 for this chassis.
Oct  2 08:42:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:00Z|00752|binding|INFO|80f550e3-804c-4ec1-86ab-fcd1b44d48b5: Claiming fa:16:3e:4e:c9:26 10.100.0.14 2001:db8::f816:3eff:fe4e:c926
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.805 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:c9:26 10.100.0.14 2001:db8::f816:3eff:fe4e:c926'], port_security=['fa:16:3e:4e:c9:26 10.100.0.14 2001:db8::f816:3eff:fe4e:c926'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8::f816:3eff:fe4e:c926/64', 'neutron:device_id': '772fac62-a076-4ec7-b14f-4b869824cceb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '629ae092-c18f-4a62-b981-2da6f9b93771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d90df5bc-8770-4be5-937c-0abfe33bbe11, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=80f550e3-804c-4ec1-86ab-fcd1b44d48b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.808 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 in datapath c4f50473-7465-4325-8b4d-bb57fca0162f bound to our chassis#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.812 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c4f50473-7465-4325-8b4d-bb57fca0162f#033[00m
Oct  2 08:42:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:00Z|00753|binding|INFO|Setting lport 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 ovn-installed in OVS
Oct  2 08:42:00 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:00Z|00754|binding|INFO|Setting lport 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 up in Southbound
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 systemd-udevd[250067]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.835 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[21704f39-aaf0-4845-8bcc-41948fc3013d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:00 np0005466013 NetworkManager[51205]: <info>  [1759408920.8457] device (tap80f550e3-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:42:00 np0005466013 NetworkManager[51205]: <info>  [1759408920.8467] device (tap80f550e3-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:42:00 np0005466013 systemd-machined[152202]: New machine qemu-79-instance-000000ae.
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.872 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae6da82-735a-447a-bca3-be5a766fa5c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.875 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[24e5f97c-82be-4576-a872-35959ac3d118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:00 np0005466013 systemd[1]: Started Virtual Machine qemu-79-instance-000000ae.
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.906 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[5a19ec64-310f-4da8-9b15-361e964ff1f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.924 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[df3a9272-a0e6-46b7-ae1e-ab1e22e8bdbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc4f50473-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:9b:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 22, 'tx_packets': 5, 'rx_bytes': 1956, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 22, 'tx_packets': 5, 'rx_bytes': 1956, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687564, 'reachable_time': 15412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 20, 'inoctets': 1592, 'indelivers': 4, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 20, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1592, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 20, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 4, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250079, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.941 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[185d3708-9541-4088-823e-05836726b1a6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc4f50473-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687574, 'tstamp': 687574}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250083, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc4f50473-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687577, 'tstamp': 687577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250083, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.943 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4f50473-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 nova_compute[192144]: 2025-10-02 12:42:00.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.945 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4f50473-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.945 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.946 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc4f50473-70, col_values=(('external_ids', {'iface-id': 'f69cd95a-5b20-4a47-8acc-7e190d1dac4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:00 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:00.946 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.613 2 DEBUG nova.compute.manager [req-9989b4e8-b0f6-4dce-9d59-176f46b1e33e req-343be375-205d-4596-98e6-38d85c48a010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.615 2 DEBUG oslo_concurrency.lockutils [req-9989b4e8-b0f6-4dce-9d59-176f46b1e33e req-343be375-205d-4596-98e6-38d85c48a010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.615 2 DEBUG oslo_concurrency.lockutils [req-9989b4e8-b0f6-4dce-9d59-176f46b1e33e req-343be375-205d-4596-98e6-38d85c48a010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.615 2 DEBUG oslo_concurrency.lockutils [req-9989b4e8-b0f6-4dce-9d59-176f46b1e33e req-343be375-205d-4596-98e6-38d85c48a010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.615 2 DEBUG nova.compute.manager [req-9989b4e8-b0f6-4dce-9d59-176f46b1e33e req-343be375-205d-4596-98e6-38d85c48a010 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Processing event network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.699 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.700 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408921.698694, 772fac62-a076-4ec7-b14f-4b869824cceb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.701 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] VM Started (Lifecycle Event)#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.705 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.709 2 INFO nova.virt.libvirt.driver [-] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Instance spawned successfully.#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.710 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.730 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:01.734 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:01 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:01.735 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.740 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.744 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.745 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.746 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.746 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.747 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.748 2 DEBUG nova.virt.libvirt.driver [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.770 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.771 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408921.6988606, 772fac62-a076-4ec7-b14f-4b869824cceb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.772 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.805 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.810 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759408921.7042072, 772fac62-a076-4ec7-b14f-4b869824cceb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.810 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.840 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.845 2 INFO nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Took 9.34 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.846 2 DEBUG nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.847 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.890 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.956 2 INFO nova.compute.manager [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Took 10.09 seconds to build instance.#033[00m
Oct  2 08:42:01 np0005466013 nova_compute[192144]: 2025-10-02 12:42:01.976 2 DEBUG oslo_concurrency.lockutils [None req-2da7db19-223e-4b01-a0e1-d2dae1ca3705 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:02 np0005466013 nova_compute[192144]: 2025-10-02 12:42:02.317 2 DEBUG nova.network.neutron [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updated VIF entry in instance network info cache for port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:02 np0005466013 nova_compute[192144]: 2025-10-02 12:42:02.318 2 DEBUG nova.network.neutron [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updating instance_info_cache with network_info: [{"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:02.326 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:02.327 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:02.328 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:02 np0005466013 nova_compute[192144]: 2025-10-02 12:42:02.364 2 DEBUG oslo_concurrency.lockutils [req-e4bb3c82-1ffc-49bc-9929-184b37f34456 req-4c8ab6f5-c78b-432b-84e4-46188618503e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:03 np0005466013 nova_compute[192144]: 2025-10-02 12:42:03.696 2 DEBUG nova.compute.manager [req-38918ec8-8fbe-4ab1-a658-1b578d5715a5 req-e806afa2-a823-4f95-b221-d0a879558387 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:03 np0005466013 nova_compute[192144]: 2025-10-02 12:42:03.697 2 DEBUG oslo_concurrency.lockutils [req-38918ec8-8fbe-4ab1-a658-1b578d5715a5 req-e806afa2-a823-4f95-b221-d0a879558387 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:03 np0005466013 nova_compute[192144]: 2025-10-02 12:42:03.697 2 DEBUG oslo_concurrency.lockutils [req-38918ec8-8fbe-4ab1-a658-1b578d5715a5 req-e806afa2-a823-4f95-b221-d0a879558387 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:03 np0005466013 nova_compute[192144]: 2025-10-02 12:42:03.697 2 DEBUG oslo_concurrency.lockutils [req-38918ec8-8fbe-4ab1-a658-1b578d5715a5 req-e806afa2-a823-4f95-b221-d0a879558387 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:03 np0005466013 nova_compute[192144]: 2025-10-02 12:42:03.698 2 DEBUG nova.compute.manager [req-38918ec8-8fbe-4ab1-a658-1b578d5715a5 req-e806afa2-a823-4f95-b221-d0a879558387 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] No waiting events found dispatching network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:03 np0005466013 nova_compute[192144]: 2025-10-02 12:42:03.698 2 WARNING nova.compute.manager [req-38918ec8-8fbe-4ab1-a658-1b578d5715a5 req-e806afa2-a823-4f95-b221-d0a879558387 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received unexpected event network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:42:03 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:03.737 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:05 np0005466013 nova_compute[192144]: 2025-10-02 12:42:05.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:05 np0005466013 nova_compute[192144]: 2025-10-02 12:42:05.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466013 nova_compute[192144]: 2025-10-02 12:42:07.885 2 DEBUG nova.compute.manager [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-changed-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:07 np0005466013 nova_compute[192144]: 2025-10-02 12:42:07.886 2 DEBUG nova.compute.manager [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Refreshing instance network info cache due to event network-changed-80f550e3-804c-4ec1-86ab-fcd1b44d48b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:42:07 np0005466013 nova_compute[192144]: 2025-10-02 12:42:07.887 2 DEBUG oslo_concurrency.lockutils [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:07 np0005466013 nova_compute[192144]: 2025-10-02 12:42:07.887 2 DEBUG oslo_concurrency.lockutils [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:07 np0005466013 nova_compute[192144]: 2025-10-02 12:42:07.887 2 DEBUG nova.network.neutron [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Refreshing network info cache for port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:09 np0005466013 nova_compute[192144]: 2025-10-02 12:42:09.367 2 DEBUG nova.network.neutron [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updated VIF entry in instance network info cache for port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:09 np0005466013 nova_compute[192144]: 2025-10-02 12:42:09.368 2 DEBUG nova.network.neutron [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updating instance_info_cache with network_info: [{"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:09 np0005466013 nova_compute[192144]: 2025-10-02 12:42:09.392 2 DEBUG oslo_concurrency.lockutils [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:09 np0005466013 podman[250097]: 2025-10-02 12:42:09.697710446 +0000 UTC m=+0.069791049 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Oct  2 08:42:09 np0005466013 podman[250096]: 2025-10-02 12:42:09.712903133 +0000 UTC m=+0.084197331 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct  2 08:42:09 np0005466013 podman[250098]: 2025-10-02 12:42:09.730739393 +0000 UTC m=+0.094817525 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Oct  2 08:42:10 np0005466013 nova_compute[192144]: 2025-10-02 12:42:10.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:10 np0005466013 nova_compute[192144]: 2025-10-02 12:42:10.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:11 np0005466013 podman[250156]: 2025-10-02 12:42:11.676772025 +0000 UTC m=+0.053242652 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:42:11 np0005466013 podman[250155]: 2025-10-02 12:42:11.710253885 +0000 UTC m=+0.080750404 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:42:14 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:14Z|00755|binding|INFO|Releasing lport f69cd95a-5b20-4a47-8acc-7e190d1dac4c from this chassis (sb_readonly=0)
Oct  2 08:42:14 np0005466013 nova_compute[192144]: 2025-10-02 12:42:14.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:15 np0005466013 nova_compute[192144]: 2025-10-02 12:42:15.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:15 np0005466013 nova_compute[192144]: 2025-10-02 12:42:15.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:18Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:c9:26 10.100.0.14
Oct  2 08:42:18 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:18Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:c9:26 10.100.0.14
Oct  2 08:42:20 np0005466013 nova_compute[192144]: 2025-10-02 12:42:20.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:20 np0005466013 nova_compute[192144]: 2025-10-02 12:42:20.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:20 np0005466013 nova_compute[192144]: 2025-10-02 12:42:20.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:25 np0005466013 nova_compute[192144]: 2025-10-02 12:42:25.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:25 np0005466013 nova_compute[192144]: 2025-10-02 12:42:25.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:25 np0005466013 nova_compute[192144]: 2025-10-02 12:42:25.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:28 np0005466013 podman[250225]: 2025-10-02 12:42:28.696302576 +0000 UTC m=+0.055024987 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:42:28 np0005466013 podman[250224]: 2025-10-02 12:42:28.712925538 +0000 UTC m=+0.086222376 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:42:28 np0005466013 podman[250226]: 2025-10-02 12:42:28.753433898 +0000 UTC m=+0.111206829 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:42:30 np0005466013 nova_compute[192144]: 2025-10-02 12:42:30.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:30 np0005466013 nova_compute[192144]: 2025-10-02 12:42:30.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:35 np0005466013 nova_compute[192144]: 2025-10-02 12:42:35.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:35 np0005466013 nova_compute[192144]: 2025-10-02 12:42:35.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.206 2 DEBUG nova.compute.manager [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-changed-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.206 2 DEBUG nova.compute.manager [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Refreshing instance network info cache due to event network-changed-80f550e3-804c-4ec1-86ab-fcd1b44d48b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.206 2 DEBUG oslo_concurrency.lockutils [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.206 2 DEBUG oslo_concurrency.lockutils [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.207 2 DEBUG nova.network.neutron [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Refreshing network info cache for port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.210 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.211 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.336 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.336 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.336 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.336 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.337 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.348 2 INFO nova.compute.manager [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Terminating instance#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.357 2 DEBUG nova.compute.manager [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:42:40 np0005466013 kernel: tap80f550e3-80 (unregistering): left promiscuous mode
Oct  2 08:42:40 np0005466013 NetworkManager[51205]: <info>  [1759408960.3835] device (tap80f550e3-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:40Z|00756|binding|INFO|Releasing lport 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 from this chassis (sb_readonly=0)
Oct  2 08:42:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:40Z|00757|binding|INFO|Setting lport 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 down in Southbound
Oct  2 08:42:40 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:40Z|00758|binding|INFO|Removing iface tap80f550e3-80 ovn-installed in OVS
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.424 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:c9:26 10.100.0.14 2001:db8::f816:3eff:fe4e:c926'], port_security=['fa:16:3e:4e:c9:26 10.100.0.14 2001:db8::f816:3eff:fe4e:c926'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8::f816:3eff:fe4e:c926/64', 'neutron:device_id': '772fac62-a076-4ec7-b14f-4b869824cceb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '629ae092-c18f-4a62-b981-2da6f9b93771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d90df5bc-8770-4be5-937c-0abfe33bbe11, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=80f550e3-804c-4ec1-86ab-fcd1b44d48b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.425 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5 in datapath c4f50473-7465-4325-8b4d-bb57fca0162f unbound from our chassis#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.427 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c4f50473-7465-4325-8b4d-bb57fca0162f#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.454 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d15c8a4a-c264-44c8-8a0d-7b33e46e6971]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:40 np0005466013 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Oct  2 08:42:40 np0005466013 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ae.scope: Consumed 14.387s CPU time.
Oct  2 08:42:40 np0005466013 systemd-machined[152202]: Machine qemu-79-instance-000000ae terminated.
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.484 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[de3444ce-b246-4dd0-b8d9-c40c57162d49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.487 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7a830ca6-4955-42f9-b5de-d3382c3987ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:40 np0005466013 podman[250295]: 2025-10-02 12:42:40.491411455 +0000 UTC m=+0.057641890 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:42:40 np0005466013 podman[250299]: 2025-10-02 12:42:40.510684059 +0000 UTC m=+0.066564409 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.512 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[dcfddedc-eb92-4fd3-8aca-e72a9141dcfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.528 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9beeebf6-695c-4d8f-94d7-979e05fbdbc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc4f50473-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:9b:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 7, 'rx_bytes': 3080, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 7, 'rx_bytes': 3080, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687564, 'reachable_time': 15412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 32, 'inoctets': 2464, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 32, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2464, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 32, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250363, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.545 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fb094aed-31d0-4c75-9ed1-3079b74fa66e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc4f50473-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687574, 'tstamp': 687574}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250364, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc4f50473-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687577, 'tstamp': 687577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250364, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.546 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4f50473-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 podman[250298]: 2025-10-02 12:42:40.548758924 +0000 UTC m=+0.112706916 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.552 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4f50473-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.553 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.553 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc4f50473-70, col_values=(('external_ids', {'iface-id': 'f69cd95a-5b20-4a47-8acc-7e190d1dac4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:40 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:40.553 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.623 2 INFO nova.virt.libvirt.driver [-] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Instance destroyed successfully.#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.624 2 DEBUG nova.objects.instance [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 772fac62-a076-4ec7-b14f-4b869824cceb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.640 2 DEBUG nova.virt.libvirt.vif [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:41:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1028993127',display_name='tempest-TestGettingAddress-server-1028993127',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1028993127',id=174,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA+W0VRkiRweukUMP9gTiBmsYPV9brmjtI29ZBDpBBAygrYooWG25zCR9mTx41solx25wcFgiDOyxlNfpE1uIsSJmxlJws14lF4ZLmAF9ATw8SJfg5U3tBL7QCF1+HVJ7Q==',key_name='tempest-TestGettingAddress-972559981',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:42:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-ie1iqzyk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:42:01Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=772fac62-a076-4ec7-b14f-4b869824cceb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.640 2 DEBUG nova.network.os_vif_util [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.641 2 DEBUG nova.network.os_vif_util [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:c9:26,bridge_name='br-int',has_traffic_filtering=True,id=80f550e3-804c-4ec1-86ab-fcd1b44d48b5,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80f550e3-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.641 2 DEBUG os_vif [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:c9:26,bridge_name='br-int',has_traffic_filtering=True,id=80f550e3-804c-4ec1-86ab-fcd1b44d48b5,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80f550e3-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.642 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80f550e3-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.651 2 INFO os_vif [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:c9:26,bridge_name='br-int',has_traffic_filtering=True,id=80f550e3-804c-4ec1-86ab-fcd1b44d48b5,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80f550e3-80')#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.651 2 INFO nova.virt.libvirt.driver [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Deleting instance files /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb_del#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.652 2 INFO nova.virt.libvirt.driver [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Deletion of /var/lib/nova/instances/772fac62-a076-4ec7-b14f-4b869824cceb_del complete#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.735 2 INFO nova.compute.manager [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.736 2 DEBUG oslo.service.loopingcall [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.738 2 DEBUG nova.compute.manager [-] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.739 2 DEBUG nova.network.neutron [-] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:42:40 np0005466013 nova_compute[192144]: 2025-10-02 12:42:40.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.529 2 DEBUG nova.compute.manager [req-4e4b860c-9c79-4829-b25a-13a8964821f3 req-194338f6-e540-4b9e-93e5-6bbc5793215a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-vif-unplugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.530 2 DEBUG oslo_concurrency.lockutils [req-4e4b860c-9c79-4829-b25a-13a8964821f3 req-194338f6-e540-4b9e-93e5-6bbc5793215a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.530 2 DEBUG oslo_concurrency.lockutils [req-4e4b860c-9c79-4829-b25a-13a8964821f3 req-194338f6-e540-4b9e-93e5-6bbc5793215a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.530 2 DEBUG oslo_concurrency.lockutils [req-4e4b860c-9c79-4829-b25a-13a8964821f3 req-194338f6-e540-4b9e-93e5-6bbc5793215a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.530 2 DEBUG nova.compute.manager [req-4e4b860c-9c79-4829-b25a-13a8964821f3 req-194338f6-e540-4b9e-93e5-6bbc5793215a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] No waiting events found dispatching network-vif-unplugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.531 2 DEBUG nova.compute.manager [req-4e4b860c-9c79-4829-b25a-13a8964821f3 req-194338f6-e540-4b9e-93e5-6bbc5793215a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-vif-unplugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.869 2 DEBUG nova.network.neutron [-] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.886 2 INFO nova.compute.manager [-] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Took 1.15 seconds to deallocate network for instance.#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.951 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:41 np0005466013 nova_compute[192144]: 2025-10-02 12:42:41.952 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.013 2 DEBUG nova.compute.provider_tree [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.028 2 DEBUG nova.scheduler.client.report [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.050 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.073 2 INFO nova.scheduler.client.report [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 772fac62-a076-4ec7-b14f-4b869824cceb#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.203 2 DEBUG oslo_concurrency.lockutils [None req-42b5c6f9-6d80-4ccb-87e8-5a19c97a6230 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.255 2 DEBUG nova.network.neutron [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updated VIF entry in instance network info cache for port 80f550e3-804c-4ec1-86ab-fcd1b44d48b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.255 2 DEBUG nova.network.neutron [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Updating instance_info_cache with network_info: [{"id": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "address": "fa:16:3e:4e:c9:26", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:c926", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80f550e3-80", "ovs_interfaceid": "80f550e3-804c-4ec1-86ab-fcd1b44d48b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.271 2 DEBUG oslo_concurrency.lockutils [req-db6996e9-d50e-4617-ada4-27f87b752255 req-aa757ec6-fd62-4054-9d28-942db2807b38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-772fac62-a076-4ec7-b14f-4b869824cceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:42 np0005466013 nova_compute[192144]: 2025-10-02 12:42:42.274 2 DEBUG nova.compute.manager [req-b215ce1c-7d5f-4790-9a8b-98592988745a req-7a218e33-99dd-4bdd-aafc-d9fce016a059 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-vif-deleted-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:42 np0005466013 podman[250385]: 2025-10-02 12:42:42.708970343 +0000 UTC m=+0.069984076 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible)
Oct  2 08:42:42 np0005466013 podman[250384]: 2025-10-02 12:42:42.739169171 +0000 UTC m=+0.099015547 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.403 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.404 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.404 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.404 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.404 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.416 2 INFO nova.compute.manager [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Terminating instance#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.430 2 DEBUG nova.compute.manager [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.461 2 DEBUG nova.compute.manager [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-changed-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.461 2 DEBUG nova.compute.manager [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Refreshing instance network info cache due to event network-changed-d526d728-0b99-484c-ba01-139df2c989a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.462 2 DEBUG oslo_concurrency.lockutils [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.462 2 DEBUG oslo_concurrency.lockutils [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.462 2 DEBUG nova.network.neutron [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Refreshing network info cache for port d526d728-0b99-484c-ba01-139df2c989a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:43 np0005466013 kernel: tapd526d728-0b (unregistering): left promiscuous mode
Oct  2 08:42:43 np0005466013 NetworkManager[51205]: <info>  [1759408963.5301] device (tapd526d728-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:43Z|00759|binding|INFO|Releasing lport d526d728-0b99-484c-ba01-139df2c989a1 from this chassis (sb_readonly=0)
Oct  2 08:42:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:43Z|00760|binding|INFO|Setting lport d526d728-0b99-484c-ba01-139df2c989a1 down in Southbound
Oct  2 08:42:43 np0005466013 ovn_controller[94366]: 2025-10-02T12:42:43Z|00761|binding|INFO|Removing iface tapd526d728-0b ovn-installed in OVS
Oct  2 08:42:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:43.566 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:83:09 10.100.0.9 2001:db8::f816:3eff:fe8f:8309'], port_security=['fa:16:3e:8f:83:09 10.100.0.9 2001:db8::f816:3eff:fe8f:8309'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe8f:8309/64', 'neutron:device_id': '2b393965-8a4a-44e3-a983-22523112e307', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '629ae092-c18f-4a62-b981-2da6f9b93771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d90df5bc-8770-4be5-937c-0abfe33bbe11, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=d526d728-0b99-484c-ba01-139df2c989a1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:43.567 103323 INFO neutron.agent.ovn.metadata.agent [-] Port d526d728-0b99-484c-ba01-139df2c989a1 in datapath c4f50473-7465-4325-8b4d-bb57fca0162f unbound from our chassis#033[00m
Oct  2 08:42:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:43.569 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c4f50473-7465-4325-8b4d-bb57fca0162f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:42:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:43.570 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[605d9e29-e033-43ad-9798-b735cec52b4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:43 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:43.571 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f namespace which is not needed anymore#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:43 np0005466013 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000ac.scope: Deactivated successfully.
Oct  2 08:42:43 np0005466013 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000ac.scope: Consumed 15.838s CPU time.
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.613 2 DEBUG nova.compute.manager [req-7e314be9-1271-4589-ba16-80f679eb686d req-836b2ea2-a783-490e-90c0-6ec86db96f59 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received event network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.614 2 DEBUG oslo_concurrency.lockutils [req-7e314be9-1271-4589-ba16-80f679eb686d req-836b2ea2-a783-490e-90c0-6ec86db96f59 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.614 2 DEBUG oslo_concurrency.lockutils [req-7e314be9-1271-4589-ba16-80f679eb686d req-836b2ea2-a783-490e-90c0-6ec86db96f59 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:43 np0005466013 systemd-machined[152202]: Machine qemu-78-instance-000000ac terminated.
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.615 2 DEBUG oslo_concurrency.lockutils [req-7e314be9-1271-4589-ba16-80f679eb686d req-836b2ea2-a783-490e-90c0-6ec86db96f59 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "772fac62-a076-4ec7-b14f-4b869824cceb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.616 2 DEBUG nova.compute.manager [req-7e314be9-1271-4589-ba16-80f679eb686d req-836b2ea2-a783-490e-90c0-6ec86db96f59 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] No waiting events found dispatching network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.617 2 WARNING nova.compute.manager [req-7e314be9-1271-4589-ba16-80f679eb686d req-836b2ea2-a783-490e-90c0-6ec86db96f59 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Received unexpected event network-vif-plugged-80f550e3-804c-4ec1-86ab-fcd1b44d48b5 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.700 2 INFO nova.virt.libvirt.driver [-] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Instance destroyed successfully.#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.701 2 DEBUG nova.objects.instance [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 2b393965-8a4a-44e3-a983-22523112e307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.715 2 DEBUG nova.virt.libvirt.vif [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-702332065',display_name='tempest-TestGettingAddress-server-702332065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-702332065',id=172,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA+W0VRkiRweukUMP9gTiBmsYPV9brmjtI29ZBDpBBAygrYooWG25zCR9mTx41solx25wcFgiDOyxlNfpE1uIsSJmxlJws14lF4ZLmAF9ATw8SJfg5U3tBL7QCF1+HVJ7Q==',key_name='tempest-TestGettingAddress-972559981',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:41:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-mta0kn8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:41:25Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=2b393965-8a4a-44e3-a983-22523112e307,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.715 2 DEBUG nova.network.os_vif_util [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.716 2 DEBUG nova.network.os_vif_util [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:83:09,bridge_name='br-int',has_traffic_filtering=True,id=d526d728-0b99-484c-ba01-139df2c989a1,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd526d728-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.716 2 DEBUG os_vif [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:83:09,bridge_name='br-int',has_traffic_filtering=True,id=d526d728-0b99-484c-ba01-139df2c989a1,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd526d728-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.718 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd526d728-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.770 2 INFO os_vif [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:83:09,bridge_name='br-int',has_traffic_filtering=True,id=d526d728-0b99-484c-ba01-139df2c989a1,network=Network(c4f50473-7465-4325-8b4d-bb57fca0162f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd526d728-0b')#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.771 2 INFO nova.virt.libvirt.driver [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Deleting instance files /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307_del#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.771 2 INFO nova.virt.libvirt.driver [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Deletion of /var/lib/nova/instances/2b393965-8a4a-44e3-a983-22523112e307_del complete#033[00m
Oct  2 08:42:43 np0005466013 neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f[249750]: [NOTICE]   (249754) : haproxy version is 2.8.14-c23fe91
Oct  2 08:42:43 np0005466013 neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f[249750]: [NOTICE]   (249754) : path to executable is /usr/sbin/haproxy
Oct  2 08:42:43 np0005466013 neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f[249750]: [WARNING]  (249754) : Exiting Master process...
Oct  2 08:42:43 np0005466013 neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f[249750]: [ALERT]    (249754) : Current worker (249756) exited with code 143 (Terminated)
Oct  2 08:42:43 np0005466013 neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f[249750]: [WARNING]  (249754) : All workers exited. Exiting... (0)
Oct  2 08:42:43 np0005466013 systemd[1]: libpod-66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9.scope: Deactivated successfully.
Oct  2 08:42:43 np0005466013 podman[250456]: 2025-10-02 12:42:43.786463592 +0000 UTC m=+0.110159207 container died 66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.852 2 INFO nova.compute.manager [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.853 2 DEBUG oslo.service.loopingcall [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.853 2 DEBUG nova.compute.manager [-] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.854 2 DEBUG nova.network.neutron [-] [instance: 2b393965-8a4a-44e3-a983-22523112e307] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:42:43 np0005466013 systemd[1]: var-lib-containers-storage-overlay-413eb68d11cdca05d60dadd398d797cbcd8079ccc2d1f5fba01adb8ca4688cf7-merged.mount: Deactivated successfully.
Oct  2 08:42:43 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9-userdata-shm.mount: Deactivated successfully.
Oct  2 08:42:43 np0005466013 podman[250456]: 2025-10-02 12:42:43.974784529 +0000 UTC m=+0.298480144 container cleanup 66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:42:43 np0005466013 systemd[1]: libpod-conmon-66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9.scope: Deactivated successfully.
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:43 np0005466013 nova_compute[192144]: 2025-10-02 12:42:43.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:42:44 np0005466013 podman[250496]: 2025-10-02 12:42:44.123942588 +0000 UTC m=+0.126019044 container remove 66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.132 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a8fb0a50-5317-49eb-a53a-9080f699193d]: (4, ('Thu Oct  2 12:42:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f (66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9)\n66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9\nThu Oct  2 12:42:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f (66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9)\n66bd383ddbb21f9342ad7861b3f47e46f0b344dbced235ddf33ac22d537e72e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.133 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[873172ea-4840-4f6d-8af9-d0a5044e8b28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.134 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4f50473-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:44 np0005466013 kernel: tapc4f50473-70: left promiscuous mode
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.162 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[610e4b6c-7fe0-4fdd-9b2a-f81bc40a9e49]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.202 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6b35efbc-452e-499f-9b89-ed034af4a396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.203 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[23a3330d-0474-4066-9e5d-6ad4a1b56cdc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.213 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.221 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbd349c-1c8c-404e-b55c-016992ada8e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687557, 'reachable_time': 29333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250512, 'error': None, 'target': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:44 np0005466013 systemd[1]: run-netns-ovnmeta\x2dc4f50473\x2d7465\x2d4325\x2d8b4d\x2dbb57fca0162f.mount: Deactivated successfully.
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.224 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:42:44 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:42:44.224 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[52b045ee-2522-41de-9b8b-09677dcec46a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.395 2 DEBUG nova.compute.manager [req-9df72bf1-6854-4413-8d9a-d6908e5f6333 req-0044a679-87bb-439a-8125-ffc6b9edd805 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-vif-unplugged-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.395 2 DEBUG oslo_concurrency.lockutils [req-9df72bf1-6854-4413-8d9a-d6908e5f6333 req-0044a679-87bb-439a-8125-ffc6b9edd805 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.395 2 DEBUG oslo_concurrency.lockutils [req-9df72bf1-6854-4413-8d9a-d6908e5f6333 req-0044a679-87bb-439a-8125-ffc6b9edd805 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.396 2 DEBUG oslo_concurrency.lockutils [req-9df72bf1-6854-4413-8d9a-d6908e5f6333 req-0044a679-87bb-439a-8125-ffc6b9edd805 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.396 2 DEBUG nova.compute.manager [req-9df72bf1-6854-4413-8d9a-d6908e5f6333 req-0044a679-87bb-439a-8125-ffc6b9edd805 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] No waiting events found dispatching network-vif-unplugged-d526d728-0b99-484c-ba01-139df2c989a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.396 2 DEBUG nova.compute.manager [req-9df72bf1-6854-4413-8d9a-d6908e5f6333 req-0044a679-87bb-439a-8125-ffc6b9edd805 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-vif-unplugged-d526d728-0b99-484c-ba01-139df2c989a1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.708 2 DEBUG nova.network.neutron [-] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.730 2 INFO nova.compute.manager [-] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Took 0.88 seconds to deallocate network for instance.#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.812 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.812 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.862 2 DEBUG nova.compute.provider_tree [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.876 2 DEBUG nova.scheduler.client.report [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.880 2 DEBUG nova.network.neutron [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updated VIF entry in instance network info cache for port d526d728-0b99-484c-ba01-139df2c989a1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.881 2 DEBUG nova.network.neutron [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Updating instance_info_cache with network_info: [{"id": "d526d728-0b99-484c-ba01-139df2c989a1", "address": "fa:16:3e:8f:83:09", "network": {"id": "c4f50473-7465-4325-8b4d-bb57fca0162f", "bridge": "br-int", "label": "tempest-network-smoke--1490536879", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8f:8309", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd526d728-0b", "ovs_interfaceid": "d526d728-0b99-484c-ba01-139df2c989a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.901 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.904 2 DEBUG oslo_concurrency.lockutils [req-22add124-ab83-401b-aca5-4a5b7690bbfc req-89b66e00-6a46-4831-abb9-5e3c91778e4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2b393965-8a4a-44e3-a983-22523112e307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.924 2 INFO nova.scheduler.client.report [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 2b393965-8a4a-44e3-a983-22523112e307#033[00m
Oct  2 08:42:44 np0005466013 nova_compute[192144]: 2025-10-02 12:42:44.990 2 DEBUG oslo_concurrency.lockutils [None req-18a4f691-e852-465e-8b27-db6204cebfd1 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:45 np0005466013 nova_compute[192144]: 2025-10-02 12:42:45.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:45 np0005466013 nova_compute[192144]: 2025-10-02 12:42:45.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:45 np0005466013 nova_compute[192144]: 2025-10-02 12:42:45.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.019 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.021 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.172 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.173 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.13291931152344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.173 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.219 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.220 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.246 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.261 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.281 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.282 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.465 2 DEBUG nova.compute.manager [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.466 2 DEBUG oslo_concurrency.lockutils [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2b393965-8a4a-44e3-a983-22523112e307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.466 2 DEBUG oslo_concurrency.lockutils [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.466 2 DEBUG oslo_concurrency.lockutils [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2b393965-8a4a-44e3-a983-22523112e307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.467 2 DEBUG nova.compute.manager [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] No waiting events found dispatching network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.467 2 WARNING nova.compute.manager [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received unexpected event network-vif-plugged-d526d728-0b99-484c-ba01-139df2c989a1 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.467 2 DEBUG nova.compute.manager [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Received event network-vif-deleted-d526d728-0b99-484c-ba01-139df2c989a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.467 2 INFO nova.compute.manager [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Neutron deleted interface d526d728-0b99-484c-ba01-139df2c989a1; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.467 2 DEBUG nova.network.neutron [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:42:46 np0005466013 nova_compute[192144]: 2025-10-02 12:42:46.470 2 DEBUG nova.compute.manager [req-a53b6338-88d3-42aa-8d3b-1b87716f495d req-933b39d7-190f-42d6-8b48-a0724a21c7ad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Detach interface failed, port_id=d526d728-0b99-484c-ba01-139df2c989a1, reason: Instance 2b393965-8a4a-44e3-a983-22523112e307 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:42:47 np0005466013 nova_compute[192144]: 2025-10-02 12:42:47.280 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:48 np0005466013 nova_compute[192144]: 2025-10-02 12:42:48.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:49 np0005466013 nova_compute[192144]: 2025-10-02 12:42:49.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:49 np0005466013 nova_compute[192144]: 2025-10-02 12:42:49.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:50 np0005466013 nova_compute[192144]: 2025-10-02 12:42:50.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:50 np0005466013 nova_compute[192144]: 2025-10-02 12:42:50.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:51 np0005466013 nova_compute[192144]: 2025-10-02 12:42:51.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:53 np0005466013 nova_compute[192144]: 2025-10-02 12:42:53.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:53 np0005466013 nova_compute[192144]: 2025-10-02 12:42:53.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:53 np0005466013 nova_compute[192144]: 2025-10-02 12:42:53.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:55 np0005466013 nova_compute[192144]: 2025-10-02 12:42:55.623 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408960.6214404, 772fac62-a076-4ec7-b14f-4b869824cceb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:55 np0005466013 nova_compute[192144]: 2025-10-02 12:42:55.623 2 INFO nova.compute.manager [-] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:42:55 np0005466013 nova_compute[192144]: 2025-10-02 12:42:55.641 2 DEBUG nova.compute.manager [None req-f680be7a-cc8a-462d-9c27-1c7902bf8a4f - - - - - -] [instance: 772fac62-a076-4ec7-b14f-4b869824cceb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:56 np0005466013 nova_compute[192144]: 2025-10-02 12:42:56.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:57 np0005466013 nova_compute[192144]: 2025-10-02 12:42:57.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:57 np0005466013 nova_compute[192144]: 2025-10-02 12:42:57.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:42:57 np0005466013 nova_compute[192144]: 2025-10-02 12:42:57.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:42:58 np0005466013 nova_compute[192144]: 2025-10-02 12:42:58.009 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:42:58 np0005466013 nova_compute[192144]: 2025-10-02 12:42:58.009 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:58 np0005466013 nova_compute[192144]: 2025-10-02 12:42:58.699 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408963.6982436, 2b393965-8a4a-44e3-a983-22523112e307 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:58 np0005466013 nova_compute[192144]: 2025-10-02 12:42:58.700 2 INFO nova.compute.manager [-] [instance: 2b393965-8a4a-44e3-a983-22523112e307] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:42:58 np0005466013 nova_compute[192144]: 2025-10-02 12:42:58.718 2 DEBUG nova.compute.manager [None req-af0cee8a-285c-4230-82a5-160eaf92cadb - - - - - -] [instance: 2b393965-8a4a-44e3-a983-22523112e307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:58 np0005466013 nova_compute[192144]: 2025-10-02 12:42:58.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:59 np0005466013 podman[250515]: 2025-10-02 12:42:59.696452964 +0000 UTC m=+0.063080600 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:42:59 np0005466013 podman[250516]: 2025-10-02 12:42:59.710636189 +0000 UTC m=+0.071558136 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct  2 08:42:59 np0005466013 podman[250517]: 2025-10-02 12:42:59.738593426 +0000 UTC m=+0.092393429 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  2 08:43:01 np0005466013 nova_compute[192144]: 2025-10-02 12:43:01.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:02.327 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:02.327 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:02.327 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:03 np0005466013 nova_compute[192144]: 2025-10-02 12:43:03.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:06 np0005466013 nova_compute[192144]: 2025-10-02 12:43:06.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:08 np0005466013 nova_compute[192144]: 2025-10-02 12:43:08.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:10 np0005466013 podman[250579]: 2025-10-02 12:43:10.696136514 +0000 UTC m=+0.063538354 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:43:10 np0005466013 podman[250578]: 2025-10-02 12:43:10.706003354 +0000 UTC m=+0.076509182 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:43:10 np0005466013 podman[250580]: 2025-10-02 12:43:10.719916889 +0000 UTC m=+0.079723181 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  2 08:43:11 np0005466013 nova_compute[192144]: 2025-10-02 12:43:11.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:13 np0005466013 podman[250638]: 2025-10-02 12:43:13.674475356 +0000 UTC m=+0.055742760 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:43:13 np0005466013 podman[250639]: 2025-10-02 12:43:13.688081343 +0000 UTC m=+0.064131433 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2)
Oct  2 08:43:13 np0005466013 nova_compute[192144]: 2025-10-02 12:43:13.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:16 np0005466013 nova_compute[192144]: 2025-10-02 12:43:16.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:43:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:18 np0005466013 nova_compute[192144]: 2025-10-02 12:43:18.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466013 nova_compute[192144]: 2025-10-02 12:43:20.839 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:20 np0005466013 nova_compute[192144]: 2025-10-02 12:43:20.839 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:20 np0005466013 nova_compute[192144]: 2025-10-02 12:43:20.902 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:43:20 np0005466013 nova_compute[192144]: 2025-10-02 12:43:20.997 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:20 np0005466013 nova_compute[192144]: 2025-10-02 12:43:20.997 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.005 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.006 2 INFO nova.compute.claims [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.131 2 DEBUG nova.compute.provider_tree [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.175 2 DEBUG nova.scheduler.client.report [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.216 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.217 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.427 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.428 2 DEBUG nova.network.neutron [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.453 2 INFO nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.521 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.644 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.645 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.646 2 INFO nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Creating image(s)#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.646 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "/var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.646 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "/var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.647 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "/var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.660 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.739 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.741 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.741 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.756 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.813 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:21 np0005466013 nova_compute[192144]: 2025-10-02 12:43:21.814 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.250 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk 1073741824" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.251 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.252 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.337 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.338 2 DEBUG nova.virt.disk.api [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Checking if we can resize image /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.339 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.400 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.402 2 DEBUG nova.virt.disk.api [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Cannot resize image /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.402 2 DEBUG nova.objects.instance [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.416 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.416 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Ensure instance console log exists: /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.417 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.417 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.417 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:22 np0005466013 nova_compute[192144]: 2025-10-02 12:43:22.520 2 DEBUG nova.network.neutron [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Successfully created port: 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.640 2 DEBUG nova.network.neutron [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Successfully updated port: 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.662 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "refresh_cache-0039457a-eaaf-4eb6-836e-2c4ce4c14a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.662 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquired lock "refresh_cache-0039457a-eaaf-4eb6-836e-2c4ce4c14a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.663 2 DEBUG nova.network.neutron [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.745 2 DEBUG nova.compute.manager [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received event network-changed-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.746 2 DEBUG nova.compute.manager [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Refreshing instance network info cache due to event network-changed-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.746 2 DEBUG oslo_concurrency.lockutils [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0039457a-eaaf-4eb6-836e-2c4ce4c14a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:23 np0005466013 nova_compute[192144]: 2025-10-02 12:43:23.873 2 DEBUG nova.network.neutron [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.708 2 DEBUG nova.network.neutron [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Updating instance_info_cache with network_info: [{"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.730 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Releasing lock "refresh_cache-0039457a-eaaf-4eb6-836e-2c4ce4c14a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.730 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Instance network_info: |[{"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.730 2 DEBUG oslo_concurrency.lockutils [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0039457a-eaaf-4eb6-836e-2c4ce4c14a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.731 2 DEBUG nova.network.neutron [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Refreshing network info cache for port 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.733 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Start _get_guest_xml network_info=[{"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.737 2 WARNING nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.741 2 DEBUG nova.virt.libvirt.host [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.741 2 DEBUG nova.virt.libvirt.host [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.744 2 DEBUG nova.virt.libvirt.host [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.745 2 DEBUG nova.virt.libvirt.host [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.747 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.748 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.748 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.748 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.748 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.749 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.749 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.749 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.749 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.749 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.750 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.750 2 DEBUG nova.virt.hardware [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.753 2 DEBUG nova.virt.libvirt.vif [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:43:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1757639312',display_name='tempest-TestServerMultinode-server-1757639312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1757639312',id=178,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0acd1c52a26d4654b24111e5ad4814f2',ramdisk_id='',reservation_id='r-5hvdq2xx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1539275040',owner_user_name='tempest-TestServerMultinode-15392
75040-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:43:21Z,user_data=None,user_id='7ed2a973cfed4867a095aecf0c6453fb',uuid=0039457a-eaaf-4eb6-836e-2c4ce4c14a83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.754 2 DEBUG nova.network.os_vif_util [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converting VIF {"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.754 2 DEBUG nova.network.os_vif_util [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:c5:aa,bridge_name='br-int',has_traffic_filtering=True,id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72aa18cf-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.755 2 DEBUG nova.objects.instance [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.770 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <uuid>0039457a-eaaf-4eb6-836e-2c4ce4c14a83</uuid>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <name>instance-000000b2</name>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <nova:name>tempest-TestServerMultinode-server-1757639312</nova:name>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:43:25</nova:creationTime>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:user uuid="7ed2a973cfed4867a095aecf0c6453fb">tempest-TestServerMultinode-1539275040-project-admin</nova:user>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:project uuid="0acd1c52a26d4654b24111e5ad4814f2">tempest-TestServerMultinode-1539275040</nova:project>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        <nova:port uuid="72aa18cf-4db0-4dfb-9c47-5b03817aa2f2">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <entry name="serial">0039457a-eaaf-4eb6-836e-2c4ce4c14a83</entry>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <entry name="uuid">0039457a-eaaf-4eb6-836e-2c4ce4c14a83</entry>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk.config"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:6b:c5:aa"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <target dev="tap72aa18cf-4d"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/console.log" append="off"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:43:25 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:43:25 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:43:25 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:43:25 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.771 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Preparing to wait for external event network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.771 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.772 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.772 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.772 2 DEBUG nova.virt.libvirt.vif [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:43:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1757639312',display_name='tempest-TestServerMultinode-server-1757639312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1757639312',id=178,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0acd1c52a26d4654b24111e5ad4814f2',ramdisk_id='',reservation_id='r-5hvdq2xx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1539275040',owner_user_name='tempest-TestServerMulti
node-1539275040-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:43:21Z,user_data=None,user_id='7ed2a973cfed4867a095aecf0c6453fb',uuid=0039457a-eaaf-4eb6-836e-2c4ce4c14a83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.773 2 DEBUG nova.network.os_vif_util [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converting VIF {"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.773 2 DEBUG nova.network.os_vif_util [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:c5:aa,bridge_name='br-int',has_traffic_filtering=True,id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72aa18cf-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.774 2 DEBUG os_vif [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:c5:aa,bridge_name='br-int',has_traffic_filtering=True,id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72aa18cf-4d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.775 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.775 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72aa18cf-4d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.778 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap72aa18cf-4d, col_values=(('external_ids', {'iface-id': '72aa18cf-4db0-4dfb-9c47-5b03817aa2f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:c5:aa', 'vm-uuid': '0039457a-eaaf-4eb6-836e-2c4ce4c14a83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:25 np0005466013 NetworkManager[51205]: <info>  [1759409005.7807] manager: (tap72aa18cf-4d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:25 np0005466013 nova_compute[192144]: 2025-10-02 12:43:25.788 2 INFO os_vif [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:c5:aa,bridge_name='br-int',has_traffic_filtering=True,id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72aa18cf-4d')#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.037 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.037 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.038 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] No VIF found with MAC fa:16:3e:6b:c5:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.039 2 INFO nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Using config drive#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.611 2 INFO nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Creating config drive at /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk.config#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.618 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7eo6c69 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.744 2 DEBUG oslo_concurrency.processutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7eo6c69" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:26 np0005466013 kernel: tap72aa18cf-4d: entered promiscuous mode
Oct  2 08:43:26 np0005466013 NetworkManager[51205]: <info>  [1759409006.8170] manager: (tap72aa18cf-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Oct  2 08:43:26 np0005466013 systemd-udevd[250714]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:26Z|00762|binding|INFO|Claiming lport 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 for this chassis.
Oct  2 08:43:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:26Z|00763|binding|INFO|72aa18cf-4db0-4dfb-9c47-5b03817aa2f2: Claiming fa:16:3e:6b:c5:aa 10.100.0.5
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:26 np0005466013 NetworkManager[51205]: <info>  [1759409006.8849] device (tap72aa18cf-4d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:43:26 np0005466013 NetworkManager[51205]: <info>  [1759409006.8860] device (tap72aa18cf-4d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:43:26 np0005466013 systemd-machined[152202]: New machine qemu-80-instance-000000b2.
Oct  2 08:43:26 np0005466013 systemd[1]: Started Virtual Machine qemu-80-instance-000000b2.
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.927 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:c5:aa 10.100.0.5'], port_security=['fa:16:3e:6b:c5:aa 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0039457a-eaaf-4eb6-836e-2c4ce4c14a83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0acd1c52a26d4654b24111e5ad4814f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f9d53e4-02f4-4598-9a8f-67bc82369860', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c1b0270-8f0a-4540-b305-4a4654e80399, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.928 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 in datapath 89c6a9c2-23c1-4b8b-81b9-3050a42a016f bound to our chassis#033[00m
Oct  2 08:43:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:26Z|00764|binding|INFO|Setting lport 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 ovn-installed in OVS
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.929 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89c6a9c2-23c1-4b8b-81b9-3050a42a016f#033[00m
Oct  2 08:43:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:26Z|00765|binding|INFO|Setting lport 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 up in Southbound
Oct  2 08:43:26 np0005466013 nova_compute[192144]: 2025-10-02 12:43:26.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.942 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c186ef47-8cb2-48e0-ae5a-fc4c663fb8b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.943 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89c6a9c2-21 in ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.945 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89c6a9c2-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.945 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca933b8-f20c-4969-8831-3e5147b993a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.946 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[68c4c1a2-e933-4ffa-bb27-8a24f064f5b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.956 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[a34db509-baad-4fb7-a84f-644994684a72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:26.980 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3db619-0ba6-42ba-857c-41224fa6d441]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.006 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0acfbe6b-922c-414e-bbae-a4e72926ce50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.010 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f8711b4b-7428-4254-899c-ac41019060ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 NetworkManager[51205]: <info>  [1759409007.0119] manager: (tap89c6a9c2-20): new Veth device (/org/freedesktop/NetworkManager/Devices/339)
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.044 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[57e53d15-a84d-4d8d-a837-ab06a5ef469b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.047 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[f770e378-a69c-4963-b098-c7dac8be8199]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 NetworkManager[51205]: <info>  [1759409007.0666] device (tap89c6a9c2-20): carrier: link connected
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.071 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[9904084a-bfdc-4536-9a4a-a2f5a6797107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.092 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba9ebdd-d9cd-4777-9750-933a9420aa75]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89c6a9c2-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:39:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 699868, 'reachable_time': 18146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250750, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.107 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[77a7b594-cb72-4ca4-a849-6a1b148f6e70]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:3914'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 699868, 'tstamp': 699868}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250751, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.123 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9d1be56c-aba3-4add-88ac-350711e8ee99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89c6a9c2-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:39:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 699868, 'reachable_time': 18146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250752, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.153 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3398cd-29e7-415b-9fb8-ad4c9c705855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.206 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5895ba81-8805-48e2-9b33-921fc32ab18d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.208 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89c6a9c2-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.208 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.209 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89c6a9c2-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:27 np0005466013 kernel: tap89c6a9c2-20: entered promiscuous mode
Oct  2 08:43:27 np0005466013 NetworkManager[51205]: <info>  [1759409007.2124] manager: (tap89c6a9c2-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.218 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89c6a9c2-20, col_values=(('external_ids', {'iface-id': 'f668a745-fb31-4662-9099-e8e7982b3bbb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:27Z|00766|binding|INFO|Releasing lport f668a745-fb31-4662-9099-e8e7982b3bbb from this chassis (sb_readonly=0)
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.221 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.222 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1a0259-f12d-4c1c-bd80-21d88f7849bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.223 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-89c6a9c2-23c1-4b8b-81b9-3050a42a016f
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.pid.haproxy
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 89c6a9c2-23c1-4b8b-81b9-3050a42a016f
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:43:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:27.223 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'env', 'PROCESS_TAG=haproxy-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:27 np0005466013 podman[250784]: 2025-10-02 12:43:27.547714189 +0000 UTC m=+0.022468945 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:43:27 np0005466013 podman[250784]: 2025-10-02 12:43:27.76994134 +0000 UTC m=+0.244696086 container create 73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:43:27 np0005466013 systemd[1]: Started libpod-conmon-73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151.scope.
Oct  2 08:43:27 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:43:27 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/875fb725f34442e94ba4ef406266b922400fe0877c653b1f93615f0ed0b4e852/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:43:27 np0005466013 podman[250784]: 2025-10-02 12:43:27.902151028 +0000 UTC m=+0.376905784 container init 73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.905 2 DEBUG nova.compute.manager [req-008b4c93-ed35-4384-bed1-076cabab18b8 req-916072f7-f222-4b3e-acfc-e1ebadfa9e3a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received event network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.907 2 DEBUG oslo_concurrency.lockutils [req-008b4c93-ed35-4384-bed1-076cabab18b8 req-916072f7-f222-4b3e-acfc-e1ebadfa9e3a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.907 2 DEBUG oslo_concurrency.lockutils [req-008b4c93-ed35-4384-bed1-076cabab18b8 req-916072f7-f222-4b3e-acfc-e1ebadfa9e3a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.908 2 DEBUG oslo_concurrency.lockutils [req-008b4c93-ed35-4384-bed1-076cabab18b8 req-916072f7-f222-4b3e-acfc-e1ebadfa9e3a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:27 np0005466013 nova_compute[192144]: 2025-10-02 12:43:27.908 2 DEBUG nova.compute.manager [req-008b4c93-ed35-4384-bed1-076cabab18b8 req-916072f7-f222-4b3e-acfc-e1ebadfa9e3a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Processing event network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:43:27 np0005466013 podman[250784]: 2025-10-02 12:43:27.911904923 +0000 UTC m=+0.386659649 container start 73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:43:27 np0005466013 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250799]: [NOTICE]   (250806) : New worker (250812) forked
Oct  2 08:43:27 np0005466013 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250799]: [NOTICE]   (250806) : Loading success.
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.447 2 DEBUG nova.network.neutron [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Updated VIF entry in instance network info cache for port 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.449 2 DEBUG nova.network.neutron [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Updating instance_info_cache with network_info: [{"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.459 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.460 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409008.4581616, 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.460 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] VM Started (Lifecycle Event)#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.464 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.467 2 INFO nova.virt.libvirt.driver [-] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Instance spawned successfully.#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.468 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.503 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.504 2 DEBUG oslo_concurrency.lockutils [req-0108cf78-607f-45e8-aaaa-9c7f5b1ffd36 req-630e1cfb-cb34-4506-bc5c-48a4fdfdf052 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0039457a-eaaf-4eb6-836e-2c4ce4c14a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.511 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.512 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.512 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.513 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.513 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.514 2 DEBUG nova.virt.libvirt.driver [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.523 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.585 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.586 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409008.4595478, 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.586 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.654 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.663 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409008.4630961, 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.663 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.732 2 INFO nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Took 7.09 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.733 2 DEBUG nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.740 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.743 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.803 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.915 2 INFO nova.compute.manager [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Took 7.96 seconds to build instance.#033[00m
Oct  2 08:43:28 np0005466013 nova_compute[192144]: 2025-10-02 12:43:28.936 2 DEBUG oslo_concurrency.lockutils [None req-18994e3f-7915-44fd-8a25-20b5a44a4ef9 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:29 np0005466013 nova_compute[192144]: 2025-10-02 12:43:29.995 2 DEBUG nova.compute.manager [req-c0177744-b99d-4fdd-9836-4ec30d05d64c req-43c4bf34-e891-4f47-9781-174ed8751947 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received event network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:29 np0005466013 nova_compute[192144]: 2025-10-02 12:43:29.996 2 DEBUG oslo_concurrency.lockutils [req-c0177744-b99d-4fdd-9836-4ec30d05d64c req-43c4bf34-e891-4f47-9781-174ed8751947 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:29 np0005466013 nova_compute[192144]: 2025-10-02 12:43:29.996 2 DEBUG oslo_concurrency.lockutils [req-c0177744-b99d-4fdd-9836-4ec30d05d64c req-43c4bf34-e891-4f47-9781-174ed8751947 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:29 np0005466013 nova_compute[192144]: 2025-10-02 12:43:29.996 2 DEBUG oslo_concurrency.lockutils [req-c0177744-b99d-4fdd-9836-4ec30d05d64c req-43c4bf34-e891-4f47-9781-174ed8751947 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:29 np0005466013 nova_compute[192144]: 2025-10-02 12:43:29.996 2 DEBUG nova.compute.manager [req-c0177744-b99d-4fdd-9836-4ec30d05d64c req-43c4bf34-e891-4f47-9781-174ed8751947 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] No waiting events found dispatching network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:29 np0005466013 nova_compute[192144]: 2025-10-02 12:43:29.996 2 WARNING nova.compute.manager [req-c0177744-b99d-4fdd-9836-4ec30d05d64c req-43c4bf34-e891-4f47-9781-174ed8751947 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received unexpected event network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.419 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.420 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.420 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.421 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.421 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.508 2 INFO nova.compute.manager [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Terminating instance#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.588 2 DEBUG nova.compute.manager [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:43:30 np0005466013 kernel: tap72aa18cf-4d (unregistering): left promiscuous mode
Oct  2 08:43:30 np0005466013 NetworkManager[51205]: <info>  [1759409010.6105] device (tap72aa18cf-4d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:43:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:30Z|00767|binding|INFO|Releasing lport 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 from this chassis (sb_readonly=0)
Oct  2 08:43:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:30Z|00768|binding|INFO|Setting lport 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 down in Southbound
Oct  2 08:43:30 np0005466013 ovn_controller[94366]: 2025-10-02T12:43:30Z|00769|binding|INFO|Removing iface tap72aa18cf-4d ovn-installed in OVS
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:30.635 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:c5:aa 10.100.0.5'], port_security=['fa:16:3e:6b:c5:aa 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0039457a-eaaf-4eb6-836e-2c4ce4c14a83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0acd1c52a26d4654b24111e5ad4814f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f9d53e4-02f4-4598-9a8f-67bc82369860', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c1b0270-8f0a-4540-b305-4a4654e80399, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:30.636 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 in datapath 89c6a9c2-23c1-4b8b-81b9-3050a42a016f unbound from our chassis#033[00m
Oct  2 08:43:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:30.637 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89c6a9c2-23c1-4b8b-81b9-3050a42a016f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:43:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:30.638 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[245de0b3-f39d-42b0-94a4-99d030927f90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:30.639 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f namespace which is not needed anymore#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005466013 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Oct  2 08:43:30 np0005466013 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b2.scope: Consumed 3.653s CPU time.
Oct  2 08:43:30 np0005466013 systemd-machined[152202]: Machine qemu-80-instance-000000b2 terminated.
Oct  2 08:43:30 np0005466013 podman[250825]: 2025-10-02 12:43:30.72560149 +0000 UTC m=+0.090092137 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:43:30 np0005466013 podman[250822]: 2025-10-02 12:43:30.73742173 +0000 UTC m=+0.100916955 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:43:30 np0005466013 podman[250826]: 2025-10-02 12:43:30.763921272 +0000 UTC m=+0.112431628 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.856 2 INFO nova.virt.libvirt.driver [-] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Instance destroyed successfully.#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.857 2 DEBUG nova.objects.instance [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lazy-loading 'resources' on Instance uuid 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.980 2 DEBUG nova.virt.libvirt.vif [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:43:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1757639312',display_name='tempest-TestServerMultinode-server-1757639312',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1757639312',id=178,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:43:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0acd1c52a26d4654b24111e5ad4814f2',ramdisk_id='',reservation_id='r-5hvdq2xx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1539275040',owner_user_name='tempest-TestServerMultinode-1539275040-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:43:28Z,user_data=None,user_id='7ed2a973cfed4867a095aecf0c6453fb',uuid=0039457a-eaaf-4eb6-836e-2c4ce4c14a83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.981 2 DEBUG nova.network.os_vif_util [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converting VIF {"id": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "address": "fa:16:3e:6b:c5:aa", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72aa18cf-4d", "ovs_interfaceid": "72aa18cf-4db0-4dfb-9c47-5b03817aa2f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.982 2 DEBUG nova.network.os_vif_util [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:c5:aa,bridge_name='br-int',has_traffic_filtering=True,id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72aa18cf-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.982 2 DEBUG os_vif [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:c5:aa,bridge_name='br-int',has_traffic_filtering=True,id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72aa18cf-4d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72aa18cf-4d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.994 2 INFO os_vif [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:c5:aa,bridge_name='br-int',has_traffic_filtering=True,id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72aa18cf-4d')#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.994 2 INFO nova.virt.libvirt.driver [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Deleting instance files /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83_del#033[00m
Oct  2 08:43:30 np0005466013 nova_compute[192144]: 2025-10-02 12:43:30.995 2 INFO nova.virt.libvirt.driver [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Deletion of /var/lib/nova/instances/0039457a-eaaf-4eb6-836e-2c4ce4c14a83_del complete#033[00m
Oct  2 08:43:31 np0005466013 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250799]: [NOTICE]   (250806) : haproxy version is 2.8.14-c23fe91
Oct  2 08:43:31 np0005466013 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250799]: [NOTICE]   (250806) : path to executable is /usr/sbin/haproxy
Oct  2 08:43:31 np0005466013 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250799]: [WARNING]  (250806) : Exiting Master process...
Oct  2 08:43:31 np0005466013 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250799]: [ALERT]    (250806) : Current worker (250812) exited with code 143 (Terminated)
Oct  2 08:43:31 np0005466013 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250799]: [WARNING]  (250806) : All workers exited. Exiting... (0)
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:31 np0005466013 systemd[1]: libpod-73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151.scope: Deactivated successfully.
Oct  2 08:43:31 np0005466013 podman[250905]: 2025-10-02 12:43:31.041617013 +0000 UTC m=+0.304238304 container died 73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.193 2 INFO nova.compute.manager [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.194 2 DEBUG oslo.service.loopingcall [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.194 2 DEBUG nova.compute.manager [-] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.194 2 DEBUG nova.network.neutron [-] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:43:31 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:31.875 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.885 2 DEBUG nova.network.neutron [-] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.926 2 DEBUG nova.compute.manager [req-2d554e8e-b32f-4138-86bf-14d8ff6aed76 req-34a45dbd-48bc-463f-abda-f1afc40fdbde 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received event network-vif-deleted-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.926 2 INFO nova.compute.manager [req-2d554e8e-b32f-4138-86bf-14d8ff6aed76 req-34a45dbd-48bc-463f-abda-f1afc40fdbde 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Neutron deleted interface 72aa18cf-4db0-4dfb-9c47-5b03817aa2f2; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.927 2 DEBUG nova.network.neutron [req-2d554e8e-b32f-4138-86bf-14d8ff6aed76 req-34a45dbd-48bc-463f-abda-f1afc40fdbde 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.929 2 INFO nova.compute.manager [-] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Took 0.73 seconds to deallocate network for instance.#033[00m
Oct  2 08:43:31 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151-userdata-shm.mount: Deactivated successfully.
Oct  2 08:43:31 np0005466013 systemd[1]: var-lib-containers-storage-overlay-875fb725f34442e94ba4ef406266b922400fe0877c653b1f93615f0ed0b4e852-merged.mount: Deactivated successfully.
Oct  2 08:43:31 np0005466013 nova_compute[192144]: 2025-10-02 12:43:31.959 2 DEBUG nova.compute.manager [req-2d554e8e-b32f-4138-86bf-14d8ff6aed76 req-34a45dbd-48bc-463f-abda-f1afc40fdbde 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Detach interface failed, port_id=72aa18cf-4db0-4dfb-9c47-5b03817aa2f2, reason: Instance 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.088 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.088 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.180 2 DEBUG nova.compute.provider_tree [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.243 2 DEBUG nova.scheduler.client.report [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:32 np0005466013 podman[250905]: 2025-10-02 12:43:32.29509032 +0000 UTC m=+1.557711611 container cleanup 73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:43:32 np0005466013 systemd[1]: libpod-conmon-73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151.scope: Deactivated successfully.
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.320 2 DEBUG nova.compute.manager [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received event network-vif-unplugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.321 2 DEBUG oslo_concurrency.lockutils [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.322 2 DEBUG oslo_concurrency.lockutils [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.322 2 DEBUG oslo_concurrency.lockutils [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.322 2 DEBUG nova.compute.manager [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] No waiting events found dispatching network-vif-unplugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.323 2 WARNING nova.compute.manager [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received unexpected event network-vif-unplugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.323 2 DEBUG nova.compute.manager [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received event network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.323 2 DEBUG oslo_concurrency.lockutils [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.323 2 DEBUG oslo_concurrency.lockutils [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.324 2 DEBUG oslo_concurrency.lockutils [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.324 2 DEBUG nova.compute.manager [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] No waiting events found dispatching network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.324 2 WARNING nova.compute.manager [req-2ab8ce73-51e9-4aed-becc-f625fdda28f8 req-f773f742-4175-4828-937e-be19d660dc06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Received unexpected event network-vif-plugged-72aa18cf-4db0-4dfb-9c47-5b03817aa2f2 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.396 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.455 2 INFO nova.scheduler.client.report [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Deleted allocations for instance 0039457a-eaaf-4eb6-836e-2c4ce4c14a83#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.571 2 DEBUG oslo_concurrency.lockutils [None req-45b95a05-b0d3-4959-83a4-3395b643f9c3 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "0039457a-eaaf-4eb6-836e-2c4ce4c14a83" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:32 np0005466013 podman[250959]: 2025-10-02 12:43:32.719278026 +0000 UTC m=+0.390809020 container remove 73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.726 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a1fe0e3f-8dc7-44b8-bd16-1ddeb444bb34]: (4, ('Thu Oct  2 12:43:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f (73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151)\n73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151\nThu Oct  2 12:43:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f (73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151)\n73072c1fe0981a3345126f9f8de0ceee03ee3a47a5e636b45e04d7d70e60b151\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.728 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9d02f8-d8bc-4649-af56-91f5ad5b28ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.730 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89c6a9c2-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:32 np0005466013 kernel: tap89c6a9c2-20: left promiscuous mode
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:32 np0005466013 nova_compute[192144]: 2025-10-02 12:43:32.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.748 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[df6c5aef-6b88-41f1-83db-42887e0d6fa6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.785 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b8223cf3-1eac-476c-b941-f7128f3bb244]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.788 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6166d6-c766-4429-ab57-acb3c323b2d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.807 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[95a4083e-f503-4543-826f-2e13537c2970]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 699861, 'reachable_time': 22714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250975, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.810 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.810 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[b90cef24-6f5d-4e37-8308-b7caa62e3f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:32 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:32.811 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:43:32 np0005466013 systemd[1]: run-netns-ovnmeta\x2d89c6a9c2\x2d23c1\x2d4b8b\x2d81b9\x2d3050a42a016f.mount: Deactivated successfully.
Oct  2 08:43:35 np0005466013 nova_compute[192144]: 2025-10-02 12:43:35.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:36 np0005466013 nova_compute[192144]: 2025-10-02 12:43:36.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:40 np0005466013 nova_compute[192144]: 2025-10-02 12:43:40.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:41 np0005466013 nova_compute[192144]: 2025-10-02 12:43:41.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:41 np0005466013 podman[250977]: 2025-10-02 12:43:41.696936962 +0000 UTC m=+0.063058199 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 
'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:43:41 np0005466013 podman[250976]: 2025-10-02 12:43:41.732173738 +0000 UTC m=+0.099247054 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:43:41 np0005466013 podman[250978]: 2025-10-02 12:43:41.733050705 +0000 UTC m=+0.087079182 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 08:43:41 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:43:41.814 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:43 np0005466013 nova_compute[192144]: 2025-10-02 12:43:43.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:44 np0005466013 podman[251036]: 2025-10-02 12:43:44.699255125 +0000 UTC m=+0.062598864 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:43:44 np0005466013 podman[251037]: 2025-10-02 12:43:44.704868092 +0000 UTC m=+0.070549165 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:43:45 np0005466013 nova_compute[192144]: 2025-10-02 12:43:45.855 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409010.8541725, 0039457a-eaaf-4eb6-836e-2c4ce4c14a83 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:45 np0005466013 nova_compute[192144]: 2025-10-02 12:43:45.856 2 INFO nova.compute.manager [-] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:43:45 np0005466013 nova_compute[192144]: 2025-10-02 12:43:45.923 2 DEBUG nova.compute.manager [None req-226f89af-b4f7-4d2c-9892-f51efe18958f - - - - - -] [instance: 0039457a-eaaf-4eb6-836e-2c4ce4c14a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:45 np0005466013 nova_compute[192144]: 2025-10-02 12:43:45.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:45 np0005466013 nova_compute[192144]: 2025-10-02 12:43:45.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:45 np0005466013 nova_compute[192144]: 2025-10-02 12:43:45.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:45 np0005466013 nova_compute[192144]: 2025-10-02 12:43:45.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:43:46 np0005466013 nova_compute[192144]: 2025-10-02 12:43:46.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:46 np0005466013 nova_compute[192144]: 2025-10-02 12:43:46.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.036 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.037 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.037 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.037 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.230 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.231 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5711MB free_disk=73.13301849365234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.232 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.232 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.293 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.294 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.327 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.360 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.391 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:43:47 np0005466013 nova_compute[192144]: 2025-10-02 12:43:47.391 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:49 np0005466013 nova_compute[192144]: 2025-10-02 12:43:49.392 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:50 np0005466013 nova_compute[192144]: 2025-10-02 12:43:50.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:51 np0005466013 nova_compute[192144]: 2025-10-02 12:43:51.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:52 np0005466013 nova_compute[192144]: 2025-10-02 12:43:52.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:55 np0005466013 nova_compute[192144]: 2025-10-02 12:43:55.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:55 np0005466013 nova_compute[192144]: 2025-10-02 12:43:55.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:55 np0005466013 nova_compute[192144]: 2025-10-02 12:43:55.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:56 np0005466013 nova_compute[192144]: 2025-10-02 12:43:56.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:59 np0005466013 nova_compute[192144]: 2025-10-02 12:43:59.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:59 np0005466013 nova_compute[192144]: 2025-10-02 12:43:59.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:43:59 np0005466013 nova_compute[192144]: 2025-10-02 12:43:59.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:44:00 np0005466013 nova_compute[192144]: 2025-10-02 12:44:00.028 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:44:00 np0005466013 nova_compute[192144]: 2025-10-02 12:44:00.030 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:01 np0005466013 nova_compute[192144]: 2025-10-02 12:44:01.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:01 np0005466013 nova_compute[192144]: 2025-10-02 12:44:01.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:01 np0005466013 podman[251081]: 2025-10-02 12:44:01.681407938 +0000 UTC m=+0.056418720 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:44:01 np0005466013 podman[251080]: 2025-10-02 12:44:01.681861162 +0000 UTC m=+0.059948531 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:44:01 np0005466013 podman[251082]: 2025-10-02 12:44:01.750097443 +0000 UTC m=+0.122772682 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:44:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:02.327 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:02.328 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:02.328 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:06 np0005466013 nova_compute[192144]: 2025-10-02 12:44:06.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:06 np0005466013 nova_compute[192144]: 2025-10-02 12:44:06.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:10.027 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:10 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:10.029 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:44:10 np0005466013 nova_compute[192144]: 2025-10-02 12:44:10.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:11 np0005466013 nova_compute[192144]: 2025-10-02 12:44:11.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:11 np0005466013 nova_compute[192144]: 2025-10-02 12:44:11.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:12.032 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:12 np0005466013 podman[251153]: 2025-10-02 12:44:12.712535123 +0000 UTC m=+0.075173409 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd)
Oct  2 08:44:12 np0005466013 podman[251154]: 2025-10-02 12:44:12.732822139 +0000 UTC m=+0.085472831 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc.)
Oct  2 08:44:12 np0005466013 podman[251155]: 2025-10-02 12:44:12.747881932 +0000 UTC m=+0.094723223 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:44:15 np0005466013 podman[251216]: 2025-10-02 12:44:15.691730482 +0000 UTC m=+0.057683961 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:44:15 np0005466013 podman[251217]: 2025-10-02 12:44:15.723451117 +0000 UTC m=+0.085493963 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:44:16 np0005466013 nova_compute[192144]: 2025-10-02 12:44:16.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:16 np0005466013 nova_compute[192144]: 2025-10-02 12:44:16.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:17 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:17Z|00770|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.119 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.119 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.138 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.245 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.246 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.255 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.255 2 INFO nova.compute.claims [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.398 2 DEBUG nova.compute.provider_tree [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.413 2 DEBUG nova.scheduler.client.report [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.432 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.433 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.488 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.488 2 DEBUG nova.network.neutron [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.505 2 INFO nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.522 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.648 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.650 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.651 2 INFO nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Creating image(s)#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.653 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "/var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.653 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "/var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.654 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "/var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.681 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.741 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.743 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.744 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.760 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.822 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.824 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.986 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk 1073741824" returned: 0 in 0.162s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.987 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:19 np0005466013 nova_compute[192144]: 2025-10-02 12:44:19.988 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.011 2 DEBUG nova.policy [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4720b7e5b8a9423e8c0d475a2e20be2b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dfbc688565e746c793b3d943d9813f40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.051 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.052 2 DEBUG nova.virt.disk.api [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Checking if we can resize image /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.052 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.131 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.133 2 DEBUG nova.virt.disk.api [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Cannot resize image /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.133 2 DEBUG nova.objects.instance [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lazy-loading 'migration_context' on Instance uuid fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.166 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.167 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Ensure instance console log exists: /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.167 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.168 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:20 np0005466013 nova_compute[192144]: 2025-10-02 12:44:20.168 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:21 np0005466013 nova_compute[192144]: 2025-10-02 12:44:21.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:21 np0005466013 nova_compute[192144]: 2025-10-02 12:44:21.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:21 np0005466013 nova_compute[192144]: 2025-10-02 12:44:21.524 2 DEBUG nova.network.neutron [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Successfully created port: 25825dcb-892c-4f4b-bd9c-b35fc536e4fa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:44:22 np0005466013 nova_compute[192144]: 2025-10-02 12:44:22.571 2 DEBUG nova.network.neutron [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Successfully updated port: 25825dcb-892c-4f4b-bd9c-b35fc536e4fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:44:22 np0005466013 nova_compute[192144]: 2025-10-02 12:44:22.596 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:22 np0005466013 nova_compute[192144]: 2025-10-02 12:44:22.597 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquired lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:22 np0005466013 nova_compute[192144]: 2025-10-02 12:44:22.597 2 DEBUG nova.network.neutron [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:44:22 np0005466013 nova_compute[192144]: 2025-10-02 12:44:22.701 2 DEBUG nova.compute.manager [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-changed-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:22 np0005466013 nova_compute[192144]: 2025-10-02 12:44:22.702 2 DEBUG nova.compute.manager [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Refreshing instance network info cache due to event network-changed-25825dcb-892c-4f4b-bd9c-b35fc536e4fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:44:22 np0005466013 nova_compute[192144]: 2025-10-02 12:44:22.702 2 DEBUG oslo_concurrency.lockutils [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:23 np0005466013 nova_compute[192144]: 2025-10-02 12:44:23.446 2 DEBUG nova.network.neutron [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.818 2 DEBUG nova.network.neutron [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updating instance_info_cache with network_info: [{"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.839 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Releasing lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.839 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Instance network_info: |[{"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.840 2 DEBUG oslo_concurrency.lockutils [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.840 2 DEBUG nova.network.neutron [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Refreshing network info cache for port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.843 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Start _get_guest_xml network_info=[{"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.850 2 WARNING nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.860 2 DEBUG nova.virt.libvirt.host [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.862 2 DEBUG nova.virt.libvirt.host [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.866 2 DEBUG nova.virt.libvirt.host [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.867 2 DEBUG nova.virt.libvirt.host [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.869 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.869 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.870 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.870 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.871 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.871 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.871 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.872 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.872 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.873 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.873 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.873 2 DEBUG nova.virt.hardware [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.878 2 DEBUG nova.virt.libvirt.vif [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-291806933-access_point-731764317',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-291806933-access_point-731764317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-291806933-acc',id=180,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEhNOBg3b/xTSagZcSj8txxntH3FGkYT7DOxHjjsRe9VmpiBcoAB/AXdGUMr1TrLAN/Df9iz4Dq9MgjgjSuz7uKI5e+2IK4FbSUyGty2k/OKwq7ViW1u3j47knodLa8RIA==',key_name='tempest-TestSecurityGroupsBasicOps-1553086170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dfbc688565e746c793b3d943d9813f40',ramdisk_id='',reservation_id='r-zd1uzee4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-291806933',owner_user_name='tempest-TestSecurityGroupsBasicOps-291806933-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:44:19Z,user_data=None,user_id='4720b7e5b8a9423e8c0d475a2e20be2b',uuid=fae6fd47-f4b2-4e6a-9e91-f172b0d4c382,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.879 2 DEBUG nova.network.os_vif_util [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Converting VIF {"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.880 2 DEBUG nova.network.os_vif_util [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:19:59,bridge_name='br-int',has_traffic_filtering=True,id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa,network=Network(dd580c44-7ea8-4947-a156-85f7d60017c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25825dcb-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.881 2 DEBUG nova.objects.instance [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lazy-loading 'pci_devices' on Instance uuid fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.896 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <uuid>fae6fd47-f4b2-4e6a-9e91-f172b0d4c382</uuid>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <name>instance-000000b4</name>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-291806933-access_point-731764317</nova:name>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:44:24</nova:creationTime>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:user uuid="4720b7e5b8a9423e8c0d475a2e20be2b">tempest-TestSecurityGroupsBasicOps-291806933-project-member</nova:user>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:project uuid="dfbc688565e746c793b3d943d9813f40">tempest-TestSecurityGroupsBasicOps-291806933</nova:project>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        <nova:port uuid="25825dcb-892c-4f4b-bd9c-b35fc536e4fa">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <entry name="serial">fae6fd47-f4b2-4e6a-9e91-f172b0d4c382</entry>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <entry name="uuid">fae6fd47-f4b2-4e6a-9e91-f172b0d4c382</entry>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk.config"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:22:19:59"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <target dev="tap25825dcb-89"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/console.log" append="off"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:44:24 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:44:24 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:44:24 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:44:24 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.898 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Preparing to wait for external event network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.899 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.899 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.900 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.901 2 DEBUG nova.virt.libvirt.vif [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-291806933-access_point-731764317',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-291806933-access_point-731764317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-291806933-acc',id=180,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEhNOBg3b/xTSagZcSj8txxntH3FGkYT7DOxHjjsRe9VmpiBcoAB/AXdGUMr1TrLAN/Df9iz4Dq9MgjgjSuz7uKI5e+2IK4FbSUyGty2k/OKwq7ViW1u3j47knodLa8RIA==',key_name='tempest-TestSecurityGroupsBasicOps-1553086170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dfbc688565e746c793b3d943d9813f40',ramdisk_id='',reservation_id='r-zd1uzee4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-291806933',owner_user_name='tempest-TestSecurityGroupsBasicOps-291806933-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:44:19Z,user_data=None,user_id='4720b7e5b8a9423e8c0d475a2e20be2b',uuid=fae6fd47-f4b2-4e6a-9e91-f172b0d4c382,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.901 2 DEBUG nova.network.os_vif_util [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Converting VIF {"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.902 2 DEBUG nova.network.os_vif_util [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:19:59,bridge_name='br-int',has_traffic_filtering=True,id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa,network=Network(dd580c44-7ea8-4947-a156-85f7d60017c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25825dcb-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.903 2 DEBUG os_vif [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:19:59,bridge_name='br-int',has_traffic_filtering=True,id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa,network=Network(dd580c44-7ea8-4947-a156-85f7d60017c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25825dcb-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.905 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.905 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25825dcb-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.911 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25825dcb-89, col_values=(('external_ids', {'iface-id': '25825dcb-892c-4f4b-bd9c-b35fc536e4fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:19:59', 'vm-uuid': 'fae6fd47-f4b2-4e6a-9e91-f172b0d4c382'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:24 np0005466013 NetworkManager[51205]: <info>  [1759409064.9138] manager: (tap25825dcb-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:24 np0005466013 nova_compute[192144]: 2025-10-02 12:44:24.919 2 INFO os_vif [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:19:59,bridge_name='br-int',has_traffic_filtering=True,id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa,network=Network(dd580c44-7ea8-4947-a156-85f7d60017c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25825dcb-89')#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.034 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.035 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.035 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] No VIF found with MAC fa:16:3e:22:19:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.035 2 INFO nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Using config drive#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.642 2 INFO nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Creating config drive at /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk.config#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.647 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqd0wxb4x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.774 2 DEBUG oslo_concurrency.processutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqd0wxb4x" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:25 np0005466013 kernel: tap25825dcb-89: entered promiscuous mode
Oct  2 08:44:25 np0005466013 NetworkManager[51205]: <info>  [1759409065.8248] manager: (tap25825dcb-89): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Oct  2 08:44:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:25Z|00771|binding|INFO|Claiming lport 25825dcb-892c-4f4b-bd9c-b35fc536e4fa for this chassis.
Oct  2 08:44:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:25Z|00772|binding|INFO|25825dcb-892c-4f4b-bd9c-b35fc536e4fa: Claiming fa:16:3e:22:19:59 10.100.0.8
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.853 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:19:59 10.100.0.8'], port_security=['fa:16:3e:22:19:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'fae6fd47-f4b2-4e6a-9e91-f172b0d4c382', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd580c44-7ea8-4947-a156-85f7d60017c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dfbc688565e746c793b3d943d9813f40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '81338e3b-03ce-46a9-ae0b-cfaf86cc38c4 8fcce784-adae-41f2-bafd-5dbfdcd7ef48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4beee2e-45ac-4e17-b436-9d8290222cd4, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=25825dcb-892c-4f4b-bd9c-b35fc536e4fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.854 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa in datapath dd580c44-7ea8-4947-a156-85f7d60017c7 bound to our chassis#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.856 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd580c44-7ea8-4947-a156-85f7d60017c7#033[00m
Oct  2 08:44:25 np0005466013 systemd-udevd[251294]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.872 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a14439-fea1-441d-915f-e2242c2c357f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.873 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd580c44-71 in ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.876 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd580c44-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.876 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[80446b2e-ddd4-4ad9-bb60-e61960d8dfd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.877 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8c905f06-5279-4ce5-9ce7-676111643c66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:25 np0005466013 NetworkManager[51205]: <info>  [1759409065.8850] device (tap25825dcb-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:44:25 np0005466013 NetworkManager[51205]: <info>  [1759409065.8858] device (tap25825dcb-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.892 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8dea96-029a-423d-8d74-5e098e85bf53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:25 np0005466013 systemd-machined[152202]: New machine qemu-81-instance-000000b4.
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:25Z|00773|binding|INFO|Setting lport 25825dcb-892c-4f4b-bd9c-b35fc536e4fa ovn-installed in OVS
Oct  2 08:44:25 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:25Z|00774|binding|INFO|Setting lport 25825dcb-892c-4f4b-bd9c-b35fc536e4fa up in Southbound
Oct  2 08:44:25 np0005466013 systemd[1]: Started Virtual Machine qemu-81-instance-000000b4.
Oct  2 08:44:25 np0005466013 nova_compute[192144]: 2025-10-02 12:44:25.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.943 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbd3cc0-f937-4ce2-9dc0-f8a6a958237e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.971 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[954859e7-9b8e-4e41-b0d4-0c866608e4d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:25 np0005466013 NetworkManager[51205]: <info>  [1759409065.9772] manager: (tapdd580c44-70): new Veth device (/org/freedesktop/NetworkManager/Devices/343)
Oct  2 08:44:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:25.978 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5b609de1-d5db-450d-b5f5-88b6fc46852b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.011 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[afc5c93a-4b5d-4f5e-9b8d-ca3a8df256a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.017 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[6810bddf-9100-4337-8635-2ed2cf6f6bf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 NetworkManager[51205]: <info>  [1759409066.0405] device (tapdd580c44-70): carrier: link connected
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.044 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0cba7b1d-c2bc-44fd-a1be-c84460dbe3bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.060 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[82e7f680-3e49-4659-ab53-f59aed82cf10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd580c44-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:13:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705765, 'reachable_time': 19886, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251327, 'error': None, 'target': 'ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.072 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[577f59e8-bf96-4c54-b389-2ad616891bff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9e:130f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705765, 'tstamp': 705765}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251328, 'error': None, 'target': 'ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.085 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ea3c00-27b0-4400-b5fa-63654f5ccf89]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd580c44-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:13:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705765, 'reachable_time': 19886, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251329, 'error': None, 'target': 'ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.108 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[782b6a84-cae4-4590-8da9-29a2b853ca8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.172 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[889858a8-6041-4c31-a7c1-5372a84b1b1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.176 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd580c44-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.176 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.176 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd580c44-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:26 np0005466013 NetworkManager[51205]: <info>  [1759409066.1791] manager: (tapdd580c44-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Oct  2 08:44:26 np0005466013 kernel: tapdd580c44-70: entered promiscuous mode
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.182 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd580c44-70, col_values=(('external_ids', {'iface-id': '6a7f5a0e-ca2b-4699-832b-55a79cd945a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:26Z|00775|binding|INFO|Releasing lport 6a7f5a0e-ca2b-4699-832b-55a79cd945a6 from this chassis (sb_readonly=0)
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.187 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd580c44-7ea8-4947-a156-85f7d60017c7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd580c44-7ea8-4947-a156-85f7d60017c7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.188 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ba2d0f-8c4e-4dcb-a92a-ec62c81b1ef3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.189 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-dd580c44-7ea8-4947-a156-85f7d60017c7
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/dd580c44-7ea8-4947-a156-85f7d60017c7.pid.haproxy
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID dd580c44-7ea8-4947-a156-85f7d60017c7
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:44:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:44:26.190 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7', 'env', 'PROCESS_TAG=haproxy-dd580c44-7ea8-4947-a156-85f7d60017c7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd580c44-7ea8-4947-a156-85f7d60017c7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.592 2 DEBUG nova.compute.manager [req-6be08470-af98-425e-8805-f988e9dc7e65 req-69e7731c-a0ea-4374-bde3-bd2c009258a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.593 2 DEBUG oslo_concurrency.lockutils [req-6be08470-af98-425e-8805-f988e9dc7e65 req-69e7731c-a0ea-4374-bde3-bd2c009258a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.593 2 DEBUG oslo_concurrency.lockutils [req-6be08470-af98-425e-8805-f988e9dc7e65 req-69e7731c-a0ea-4374-bde3-bd2c009258a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.594 2 DEBUG oslo_concurrency.lockutils [req-6be08470-af98-425e-8805-f988e9dc7e65 req-69e7731c-a0ea-4374-bde3-bd2c009258a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.594 2 DEBUG nova.compute.manager [req-6be08470-af98-425e-8805-f988e9dc7e65 req-69e7731c-a0ea-4374-bde3-bd2c009258a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Processing event network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:44:26 np0005466013 podman[251368]: 2025-10-02 12:44:26.528860422 +0000 UTC m=+0.023359294 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.796 2 DEBUG nova.network.neutron [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updated VIF entry in instance network info cache for port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.796 2 DEBUG nova.network.neutron [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updating instance_info_cache with network_info: [{"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.817 2 DEBUG oslo_concurrency.lockutils [req-f559b3c4-e929-4d6c-852d-c08842484f05 req-7b22b554-c3d5-46fe-9429-4ea509c63892 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.900 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409066.8996785, fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.900 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] VM Started (Lifecycle Event)#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.903 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.906 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.909 2 INFO nova.virt.libvirt.driver [-] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Instance spawned successfully.#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.909 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.927 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.933 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.938 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.939 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.939 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.940 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.940 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.941 2 DEBUG nova.virt.libvirt.driver [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.951 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.951 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409066.899785, fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.951 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.975 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.980 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409066.9071262, fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:44:26 np0005466013 nova_compute[192144]: 2025-10-02 12:44:26.980 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:44:27 np0005466013 nova_compute[192144]: 2025-10-02 12:44:27.006 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:27 np0005466013 nova_compute[192144]: 2025-10-02 12:44:27.010 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:44:27 np0005466013 podman[251368]: 2025-10-02 12:44:27.015486186 +0000 UTC m=+0.509985028 container create 098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:44:27 np0005466013 nova_compute[192144]: 2025-10-02 12:44:27.027 2 INFO nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Took 7.38 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:44:27 np0005466013 nova_compute[192144]: 2025-10-02 12:44:27.027 2 DEBUG nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:44:27 np0005466013 nova_compute[192144]: 2025-10-02 12:44:27.033 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:44:27 np0005466013 systemd[1]: Started libpod-conmon-098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf.scope.
Oct  2 08:44:27 np0005466013 nova_compute[192144]: 2025-10-02 12:44:27.111 2 INFO nova.compute.manager [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Took 7.91 seconds to build instance.#033[00m
Oct  2 08:44:27 np0005466013 nova_compute[192144]: 2025-10-02 12:44:27.131 2 DEBUG oslo_concurrency.lockutils [None req-e0bfcdc8-1f30-4910-a8c9-375e0dc84bfd 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:27 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:44:27 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/254fca8021587b25a9f2cc4d78626dc4590b5a550fc2eb653666c83368297bea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:44:27 np0005466013 podman[251368]: 2025-10-02 12:44:27.193306404 +0000 UTC m=+0.687805266 container init 098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:44:27 np0005466013 podman[251368]: 2025-10-02 12:44:27.198415554 +0000 UTC m=+0.692914396 container start 098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:44:27 np0005466013 neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7[251384]: [NOTICE]   (251388) : New worker (251390) forked
Oct  2 08:44:27 np0005466013 neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7[251384]: [NOTICE]   (251388) : Loading success.
Oct  2 08:44:28 np0005466013 nova_compute[192144]: 2025-10-02 12:44:28.698 2 DEBUG nova.compute.manager [req-aa78e8ad-7193-4a1c-a3ae-2824767021c5 req-ed943384-60c7-4c54-8c24-d27042f140d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:28 np0005466013 nova_compute[192144]: 2025-10-02 12:44:28.698 2 DEBUG oslo_concurrency.lockutils [req-aa78e8ad-7193-4a1c-a3ae-2824767021c5 req-ed943384-60c7-4c54-8c24-d27042f140d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:28 np0005466013 nova_compute[192144]: 2025-10-02 12:44:28.699 2 DEBUG oslo_concurrency.lockutils [req-aa78e8ad-7193-4a1c-a3ae-2824767021c5 req-ed943384-60c7-4c54-8c24-d27042f140d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:28 np0005466013 nova_compute[192144]: 2025-10-02 12:44:28.699 2 DEBUG oslo_concurrency.lockutils [req-aa78e8ad-7193-4a1c-a3ae-2824767021c5 req-ed943384-60c7-4c54-8c24-d27042f140d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:28 np0005466013 nova_compute[192144]: 2025-10-02 12:44:28.699 2 DEBUG nova.compute.manager [req-aa78e8ad-7193-4a1c-a3ae-2824767021c5 req-ed943384-60c7-4c54-8c24-d27042f140d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] No waiting events found dispatching network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:44:28 np0005466013 nova_compute[192144]: 2025-10-02 12:44:28.699 2 WARNING nova.compute.manager [req-aa78e8ad-7193-4a1c-a3ae-2824767021c5 req-ed943384-60c7-4c54-8c24-d27042f140d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received unexpected event network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa for instance with vm_state active and task_state None.#033[00m
Oct  2 08:44:29 np0005466013 nova_compute[192144]: 2025-10-02 12:44:29.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:31 np0005466013 nova_compute[192144]: 2025-10-02 12:44:31.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005466013 NetworkManager[51205]: <info>  [1759409072.5216] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Oct  2 08:44:32 np0005466013 NetworkManager[51205]: <info>  [1759409072.5228] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:32Z|00776|binding|INFO|Releasing lport 6a7f5a0e-ca2b-4699-832b-55a79cd945a6 from this chassis (sb_readonly=0)
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005466013 podman[251402]: 2025-10-02 12:44:32.690752513 +0000 UTC m=+0.062407689 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 08:44:32 np0005466013 podman[251401]: 2025-10-02 12:44:32.714718325 +0000 UTC m=+0.086888117 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:44:32 np0005466013 podman[251403]: 2025-10-02 12:44:32.752803399 +0000 UTC m=+0.121094639 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.863 2 DEBUG nova.compute.manager [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-changed-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.865 2 DEBUG nova.compute.manager [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Refreshing instance network info cache due to event network-changed-25825dcb-892c-4f4b-bd9c-b35fc536e4fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.866 2 DEBUG oslo_concurrency.lockutils [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.866 2 DEBUG oslo_concurrency.lockutils [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:32 np0005466013 nova_compute[192144]: 2025-10-02 12:44:32.866 2 DEBUG nova.network.neutron [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Refreshing network info cache for port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:44:34 np0005466013 nova_compute[192144]: 2025-10-02 12:44:34.788 2 DEBUG nova.network.neutron [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updated VIF entry in instance network info cache for port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:44:34 np0005466013 nova_compute[192144]: 2025-10-02 12:44:34.788 2 DEBUG nova.network.neutron [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updating instance_info_cache with network_info: [{"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:34 np0005466013 nova_compute[192144]: 2025-10-02 12:44:34.808 2 DEBUG oslo_concurrency.lockutils [req-941e9726-ee54-4476-9199-a473c0ffd141 req-f68b1938-0a7a-4dc1-84a6-3a66a355ad23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:34 np0005466013 nova_compute[192144]: 2025-10-02 12:44:34.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:36 np0005466013 nova_compute[192144]: 2025-10-02 12:44:36.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005466013 nova_compute[192144]: 2025-10-02 12:44:39.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:41 np0005466013 nova_compute[192144]: 2025-10-02 12:44:41.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:43 np0005466013 podman[251483]: 2025-10-02 12:44:43.687741007 +0000 UTC m=+0.059596310 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:44:43 np0005466013 podman[251488]: 2025-10-02 12:44:43.709486039 +0000 UTC m=+0.068725646 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:44:43 np0005466013 podman[251484]: 2025-10-02 12:44:43.723790788 +0000 UTC m=+0.090510390 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:44:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:44Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:19:59 10.100.0.8
Oct  2 08:44:44 np0005466013 ovn_controller[94366]: 2025-10-02T12:44:44Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:19:59 10.100.0.8
Oct  2 08:44:44 np0005466013 nova_compute[192144]: 2025-10-02 12:44:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:45 np0005466013 nova_compute[192144]: 2025-10-02 12:44:45.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:45 np0005466013 nova_compute[192144]: 2025-10-02 12:44:45.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:44:46 np0005466013 nova_compute[192144]: 2025-10-02 12:44:46.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:46 np0005466013 podman[251542]: 2025-10-02 12:44:46.678810498 +0000 UTC m=+0.057127833 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:44:46 np0005466013 podman[251543]: 2025-10-02 12:44:46.704728781 +0000 UTC m=+0.074415085 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:44:46 np0005466013 nova_compute[192144]: 2025-10-02 12:44:46.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:47 np0005466013 nova_compute[192144]: 2025-10-02 12:44:47.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.023 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.092 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.148 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.149 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.209 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.367 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.368 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5530MB free_disk=73.10421371459961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.369 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.369 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.460 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.461 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.461 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.516 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.546 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.574 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:44:48 np0005466013 nova_compute[192144]: 2025-10-02 12:44:48.574 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:49 np0005466013 nova_compute[192144]: 2025-10-02 12:44:49.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:51 np0005466013 nova_compute[192144]: 2025-10-02 12:44:51.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:51 np0005466013 nova_compute[192144]: 2025-10-02 12:44:51.574 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:52 np0005466013 nova_compute[192144]: 2025-10-02 12:44:52.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:53 np0005466013 nova_compute[192144]: 2025-10-02 12:44:53.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:54 np0005466013 nova_compute[192144]: 2025-10-02 12:44:54.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:55 np0005466013 nova_compute[192144]: 2025-10-02 12:44:55.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:55 np0005466013 nova_compute[192144]: 2025-10-02 12:44:55.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:55 np0005466013 nova_compute[192144]: 2025-10-02 12:44:55.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:44:56 np0005466013 nova_compute[192144]: 2025-10-02 12:44:56.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:57 np0005466013 nova_compute[192144]: 2025-10-02 12:44:57.011 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:59 np0005466013 nova_compute[192144]: 2025-10-02 12:44:59.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:01 np0005466013 nova_compute[192144]: 2025-10-02 12:45:01.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:01 np0005466013 nova_compute[192144]: 2025-10-02 12:45:01.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:01 np0005466013 nova_compute[192144]: 2025-10-02 12:45:01.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:45:01 np0005466013 nova_compute[192144]: 2025-10-02 12:45:01.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:45:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:02.328 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:02.329 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:02.330 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:02 np0005466013 ovn_controller[94366]: 2025-10-02T12:45:02Z|00777|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  2 08:45:03 np0005466013 nova_compute[192144]: 2025-10-02 12:45:03.479 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:45:03 np0005466013 nova_compute[192144]: 2025-10-02 12:45:03.480 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:45:03 np0005466013 nova_compute[192144]: 2025-10-02 12:45:03.480 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:45:03 np0005466013 nova_compute[192144]: 2025-10-02 12:45:03.481 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:03 np0005466013 podman[251592]: 2025-10-02 12:45:03.681551587 +0000 UTC m=+0.047754088 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:45:03 np0005466013 podman[251591]: 2025-10-02 12:45:03.704679193 +0000 UTC m=+0.073650501 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:45:03 np0005466013 podman[251593]: 2025-10-02 12:45:03.717809725 +0000 UTC m=+0.080724103 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:45:04 np0005466013 nova_compute[192144]: 2025-10-02 12:45:04.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:06 np0005466013 nova_compute[192144]: 2025-10-02 12:45:06.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.641 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updating instance_info_cache with network_info: [{"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.676 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.677 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.678 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.678 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.679 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.716 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:45:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:08.785 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:08 np0005466013 nova_compute[192144]: 2025-10-02 12:45:08.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:08.787 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:45:09 np0005466013 nova_compute[192144]: 2025-10-02 12:45:09.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:11 np0005466013 nova_compute[192144]: 2025-10-02 12:45:11.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:11 np0005466013 nova_compute[192144]: 2025-10-02 12:45:11.727 2 DEBUG nova.compute.manager [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-changed-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:11 np0005466013 nova_compute[192144]: 2025-10-02 12:45:11.727 2 DEBUG nova.compute.manager [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Refreshing instance network info cache due to event network-changed-25825dcb-892c-4f4b-bd9c-b35fc536e4fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:45:11 np0005466013 nova_compute[192144]: 2025-10-02 12:45:11.728 2 DEBUG oslo_concurrency.lockutils [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:45:11 np0005466013 nova_compute[192144]: 2025-10-02 12:45:11.728 2 DEBUG oslo_concurrency.lockutils [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:45:11 np0005466013 nova_compute[192144]: 2025-10-02 12:45:11.728 2 DEBUG nova.network.neutron [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Refreshing network info cache for port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.014 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.014 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.014 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.015 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.015 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.032 2 INFO nova.compute.manager [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Terminating instance#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.060 2 DEBUG nova.compute.manager [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:45:12 np0005466013 kernel: tap25825dcb-89 (unregistering): left promiscuous mode
Oct  2 08:45:12 np0005466013 NetworkManager[51205]: <info>  [1759409112.0816] device (tap25825dcb-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:45:12Z|00778|binding|INFO|Releasing lport 25825dcb-892c-4f4b-bd9c-b35fc536e4fa from this chassis (sb_readonly=0)
Oct  2 08:45:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:45:12Z|00779|binding|INFO|Setting lport 25825dcb-892c-4f4b-bd9c-b35fc536e4fa down in Southbound
Oct  2 08:45:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:45:12Z|00780|binding|INFO|Removing iface tap25825dcb-89 ovn-installed in OVS
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Oct  2 08:45:12 np0005466013 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000b4.scope: Consumed 15.795s CPU time.
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.132 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:19:59 10.100.0.8'], port_security=['fa:16:3e:22:19:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'fae6fd47-f4b2-4e6a-9e91-f172b0d4c382', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd580c44-7ea8-4947-a156-85f7d60017c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dfbc688565e746c793b3d943d9813f40', 'neutron:revision_number': '4', 'neutron:security_group_ids': '81338e3b-03ce-46a9-ae0b-cfaf86cc38c4 8fcce784-adae-41f2-bafd-5dbfdcd7ef48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4beee2e-45ac-4e17-b436-9d8290222cd4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=25825dcb-892c-4f4b-bd9c-b35fc536e4fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:12 np0005466013 systemd-machined[152202]: Machine qemu-81-instance-000000b4 terminated.
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.134 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa in datapath dd580c44-7ea8-4947-a156-85f7d60017c7 unbound from our chassis#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.135 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd580c44-7ea8-4947-a156-85f7d60017c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.136 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[8a567f9f-fd33-455c-b58e-057eddd1c49c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.136 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7 namespace which is not needed anymore#033[00m
Oct  2 08:45:12 np0005466013 neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7[251384]: [NOTICE]   (251388) : haproxy version is 2.8.14-c23fe91
Oct  2 08:45:12 np0005466013 neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7[251384]: [NOTICE]   (251388) : path to executable is /usr/sbin/haproxy
Oct  2 08:45:12 np0005466013 neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7[251384]: [WARNING]  (251388) : Exiting Master process...
Oct  2 08:45:12 np0005466013 neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7[251384]: [ALERT]    (251388) : Current worker (251390) exited with code 143 (Terminated)
Oct  2 08:45:12 np0005466013 neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7[251384]: [WARNING]  (251388) : All workers exited. Exiting... (0)
Oct  2 08:45:12 np0005466013 systemd[1]: libpod-098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf.scope: Deactivated successfully.
Oct  2 08:45:12 np0005466013 podman[251679]: 2025-10-02 12:45:12.26400642 +0000 UTC m=+0.042811765 container died 098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf-userdata-shm.mount: Deactivated successfully.
Oct  2 08:45:12 np0005466013 systemd[1]: var-lib-containers-storage-overlay-254fca8021587b25a9f2cc4d78626dc4590b5a550fc2eb653666c83368297bea-merged.mount: Deactivated successfully.
Oct  2 08:45:12 np0005466013 podman[251679]: 2025-10-02 12:45:12.309700267 +0000 UTC m=+0.088505612 container cleanup 098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:45:12 np0005466013 systemd[1]: libpod-conmon-098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf.scope: Deactivated successfully.
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.330 2 INFO nova.virt.libvirt.driver [-] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Instance destroyed successfully.#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.331 2 DEBUG nova.objects.instance [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lazy-loading 'resources' on Instance uuid fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:12 np0005466013 podman[251721]: 2025-10-02 12:45:12.380797021 +0000 UTC m=+0.044778548 container remove 098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.389 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb10a03-651b-4f16-b67a-8d118d48a98c]: (4, ('Thu Oct  2 12:45:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7 (098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf)\n098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf\nThu Oct  2 12:45:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7 (098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf)\n098b4cb97295ceb5039269bf08eeffad40a56a297048a7f5da061862275c8aaf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.391 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c021d810-c4e8-42bc-b87a-ee35a22c619c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.392 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd580c44-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 kernel: tapdd580c44-70: left promiscuous mode
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.415 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb6fe17-0322-4743-9fef-b1c711899c6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.423 2 DEBUG nova.virt.libvirt.vif [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-291806933-access_point-731764317',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-291806933-access_point-731764317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-291806933-acc',id=180,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEhNOBg3b/xTSagZcSj8txxntH3FGkYT7DOxHjjsRe9VmpiBcoAB/AXdGUMr1TrLAN/Df9iz4Dq9MgjgjSuz7uKI5e+2IK4FbSUyGty2k/OKwq7ViW1u3j47knodLa8RIA==',key_name='tempest-TestSecurityGroupsBasicOps-1553086170',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:44:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dfbc688565e746c793b3d943d9813f40',ramdisk_id='',reservation_id='r-zd1uzee4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-291806933',owner_user_name='tempest-TestSecurityGroupsBasicOps-291806933-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:44:27Z,user_data=None,user_id='4720b7e5b8a9423e8c0d475a2e20be2b',uuid=fae6fd47-f4b2-4e6a-9e91-f172b0d4c382,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.423 2 DEBUG nova.network.os_vif_util [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Converting VIF {"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.424 2 DEBUG nova.network.os_vif_util [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:19:59,bridge_name='br-int',has_traffic_filtering=True,id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa,network=Network(dd580c44-7ea8-4947-a156-85f7d60017c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25825dcb-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.424 2 DEBUG os_vif [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:19:59,bridge_name='br-int',has_traffic_filtering=True,id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa,network=Network(dd580c44-7ea8-4947-a156-85f7d60017c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25825dcb-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.426 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25825dcb-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.432 2 INFO os_vif [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:19:59,bridge_name='br-int',has_traffic_filtering=True,id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa,network=Network(dd580c44-7ea8-4947-a156-85f7d60017c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25825dcb-89')#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.433 2 INFO nova.virt.libvirt.driver [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Deleting instance files /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382_del#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.434 2 INFO nova.virt.libvirt.driver [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Deletion of /var/lib/nova/instances/fae6fd47-f4b2-4e6a-9e91-f172b0d4c382_del complete#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.453 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e38113c7-609f-42bc-8ce9-1a003e283d0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.455 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb10c7d-8ee3-4b84-a6bb-50103d59699d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.478 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[994092f7-39d1-44b6-b714-eb70cc70517d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705758, 'reachable_time': 28483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251740, 'error': None, 'target': 'ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.481 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd580c44-7ea8-4947-a156-85f7d60017c7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:45:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:12.481 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[024ee252-4200-4277-bf63-ede49d4192b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:12 np0005466013 systemd[1]: run-netns-ovnmeta\x2ddd580c44\x2d7ea8\x2d4947\x2da156\x2d85f7d60017c7.mount: Deactivated successfully.
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.523 2 DEBUG nova.compute.manager [req-b13477bb-ac44-4998-916a-dcbf7af5a042 req-dfb25468-49fc-4738-9bf5-9d40d2a83503 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-vif-unplugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.524 2 DEBUG oslo_concurrency.lockutils [req-b13477bb-ac44-4998-916a-dcbf7af5a042 req-dfb25468-49fc-4738-9bf5-9d40d2a83503 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.524 2 DEBUG oslo_concurrency.lockutils [req-b13477bb-ac44-4998-916a-dcbf7af5a042 req-dfb25468-49fc-4738-9bf5-9d40d2a83503 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.524 2 DEBUG oslo_concurrency.lockutils [req-b13477bb-ac44-4998-916a-dcbf7af5a042 req-dfb25468-49fc-4738-9bf5-9d40d2a83503 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.524 2 DEBUG nova.compute.manager [req-b13477bb-ac44-4998-916a-dcbf7af5a042 req-dfb25468-49fc-4738-9bf5-9d40d2a83503 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] No waiting events found dispatching network-vif-unplugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.524 2 DEBUG nova.compute.manager [req-b13477bb-ac44-4998-916a-dcbf7af5a042 req-dfb25468-49fc-4738-9bf5-9d40d2a83503 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-vif-unplugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.613 2 INFO nova.compute.manager [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.613 2 DEBUG oslo.service.loopingcall [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.614 2 DEBUG nova.compute.manager [-] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:45:12 np0005466013 nova_compute[192144]: 2025-10-02 12:45:12.614 2 DEBUG nova.network.neutron [-] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.355 2 DEBUG nova.network.neutron [-] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.360 2 DEBUG nova.network.neutron [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updated VIF entry in instance network info cache for port 25825dcb-892c-4f4b-bd9c-b35fc536e4fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.361 2 DEBUG nova.network.neutron [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Updating instance_info_cache with network_info: [{"id": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "address": "fa:16:3e:22:19:59", "network": {"id": "dd580c44-7ea8-4947-a156-85f7d60017c7", "bridge": "br-int", "label": "tempest-network-smoke--1617017631", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dfbc688565e746c793b3d943d9813f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25825dcb-89", "ovs_interfaceid": "25825dcb-892c-4f4b-bd9c-b35fc536e4fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.392 2 INFO nova.compute.manager [-] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Took 0.78 seconds to deallocate network for instance.#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.399 2 DEBUG oslo_concurrency.lockutils [req-e79afee2-5698-4173-bdba-3703c89e0483 req-586f4dd8-9b4b-4e6d-b63b-fa1cb5fa7ea7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.472 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.472 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.531 2 DEBUG nova.compute.provider_tree [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.548 2 DEBUG nova.scheduler.client.report [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.578 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.604 2 INFO nova.scheduler.client.report [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Deleted allocations for instance fae6fd47-f4b2-4e6a-9e91-f172b0d4c382#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.692 2 DEBUG oslo_concurrency.lockutils [None req-2f7be7a5-ec7a-4d7f-8845-909f8c8c64f7 4720b7e5b8a9423e8c0d475a2e20be2b dfbc688565e746c793b3d943d9813f40 - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:13 np0005466013 nova_compute[192144]: 2025-10-02 12:45:13.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.621 2 DEBUG nova.compute.manager [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.622 2 DEBUG oslo_concurrency.lockutils [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.622 2 DEBUG oslo_concurrency.lockutils [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.622 2 DEBUG oslo_concurrency.lockutils [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fae6fd47-f4b2-4e6a-9e91-f172b0d4c382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.623 2 DEBUG nova.compute.manager [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] No waiting events found dispatching network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.623 2 WARNING nova.compute.manager [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received unexpected event network-vif-plugged-25825dcb-892c-4f4b-bd9c-b35fc536e4fa for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.623 2 DEBUG nova.compute.manager [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Received event network-vif-deleted-25825dcb-892c-4f4b-bd9c-b35fc536e4fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.623 2 INFO nova.compute.manager [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Neutron deleted interface 25825dcb-892c-4f4b-bd9c-b35fc536e4fa; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.624 2 DEBUG nova.network.neutron [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:45:14 np0005466013 nova_compute[192144]: 2025-10-02 12:45:14.627 2 DEBUG nova.compute.manager [req-c9b762e8-a8a9-465f-8119-4b0305936f1b req-5808ab13-4b0e-4d23-ad7b-7bb77d747e34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Detach interface failed, port_id=25825dcb-892c-4f4b-bd9c-b35fc536e4fa, reason: Instance fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:45:14 np0005466013 podman[251742]: 2025-10-02 12:45:14.702259161 +0000 UTC m=+0.066704927 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Oct  2 08:45:14 np0005466013 podman[251743]: 2025-10-02 12:45:14.725617554 +0000 UTC m=+0.087311344 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm)
Oct  2 08:45:14 np0005466013 podman[251741]: 2025-10-02 12:45:14.726236854 +0000 UTC m=+0.095127710 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:45:15 np0005466013 nova_compute[192144]: 2025-10-02 12:45:15.371 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:16 np0005466013 nova_compute[192144]: 2025-10-02 12:45:16.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:45:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:45:16 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:45:16.790 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:17 np0005466013 nova_compute[192144]: 2025-10-02 12:45:17.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:17 np0005466013 podman[251795]: 2025-10-02 12:45:17.666945284 +0000 UTC m=+0.047311558 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:45:17 np0005466013 podman[251796]: 2025-10-02 12:45:17.67477024 +0000 UTC m=+0.052579703 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20251001)
Oct  2 08:45:18 np0005466013 nova_compute[192144]: 2025-10-02 12:45:18.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005466013 nova_compute[192144]: 2025-10-02 12:45:18.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466013 nova_compute[192144]: 2025-10-02 12:45:21.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:22 np0005466013 nova_compute[192144]: 2025-10-02 12:45:22.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:26 np0005466013 nova_compute[192144]: 2025-10-02 12:45:26.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:27 np0005466013 nova_compute[192144]: 2025-10-02 12:45:27.330 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409112.3285065, fae6fd47-f4b2-4e6a-9e91-f172b0d4c382 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:45:27 np0005466013 nova_compute[192144]: 2025-10-02 12:45:27.330 2 INFO nova.compute.manager [-] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:45:27 np0005466013 nova_compute[192144]: 2025-10-02 12:45:27.357 2 DEBUG nova.compute.manager [None req-e1a71709-244d-45b5-a6bd-903828f3c18c - - - - - -] [instance: fae6fd47-f4b2-4e6a-9e91-f172b0d4c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:27 np0005466013 nova_compute[192144]: 2025-10-02 12:45:27.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:31 np0005466013 nova_compute[192144]: 2025-10-02 12:45:31.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:32 np0005466013 nova_compute[192144]: 2025-10-02 12:45:32.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:34 np0005466013 podman[251840]: 2025-10-02 12:45:34.680816772 +0000 UTC m=+0.049063493 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:45:34 np0005466013 podman[251841]: 2025-10-02 12:45:34.703456932 +0000 UTC m=+0.065418026 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct  2 08:45:34 np0005466013 podman[251842]: 2025-10-02 12:45:34.735746668 +0000 UTC m=+0.096220226 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:45:36 np0005466013 nova_compute[192144]: 2025-10-02 12:45:36.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:37 np0005466013 nova_compute[192144]: 2025-10-02 12:45:37.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:41 np0005466013 nova_compute[192144]: 2025-10-02 12:45:41.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:42 np0005466013 nova_compute[192144]: 2025-10-02 12:45:42.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:45 np0005466013 podman[251907]: 2025-10-02 12:45:45.708777617 +0000 UTC m=+0.072682985 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Oct  2 08:45:45 np0005466013 podman[251908]: 2025-10-02 12:45:45.719141472 +0000 UTC m=+0.080374436 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct  2 08:45:45 np0005466013 podman[251906]: 2025-10-02 12:45:45.723791698 +0000 UTC m=+0.102667787 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:45:46 np0005466013 nova_compute[192144]: 2025-10-02 12:45:46.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:46 np0005466013 nova_compute[192144]: 2025-10-02 12:45:46.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:46 np0005466013 nova_compute[192144]: 2025-10-02 12:45:46.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:45:47 np0005466013 nova_compute[192144]: 2025-10-02 12:45:47.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:48 np0005466013 podman[251966]: 2025-10-02 12:45:48.690337288 +0000 UTC m=+0.061551705 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:45:48 np0005466013 podman[251965]: 2025-10-02 12:45:48.711654918 +0000 UTC m=+0.075056860 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:45:48 np0005466013 nova_compute[192144]: 2025-10-02 12:45:48.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:49 np0005466013 nova_compute[192144]: 2025-10-02 12:45:49.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.020 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.020 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.167 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.168 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.13301086425781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.168 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.169 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.350 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.350 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.432 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.511 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.511 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.533 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.557 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.584 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.607 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.633 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:45:50 np0005466013 nova_compute[192144]: 2025-10-02 12:45:50.633 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:51 np0005466013 nova_compute[192144]: 2025-10-02 12:45:51.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:52 np0005466013 nova_compute[192144]: 2025-10-02 12:45:52.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:53 np0005466013 nova_compute[192144]: 2025-10-02 12:45:53.633 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:53 np0005466013 nova_compute[192144]: 2025-10-02 12:45:53.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:56 np0005466013 nova_compute[192144]: 2025-10-02 12:45:56.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:56 np0005466013 nova_compute[192144]: 2025-10-02 12:45:56.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:57 np0005466013 nova_compute[192144]: 2025-10-02 12:45:57.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:57 np0005466013 nova_compute[192144]: 2025-10-02 12:45:57.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:01 np0005466013 nova_compute[192144]: 2025-10-02 12:46:01.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:46:02.329 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:46:02.330 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:46:02.330 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:02 np0005466013 nova_compute[192144]: 2025-10-02 12:46:02.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:02 np0005466013 nova_compute[192144]: 2025-10-02 12:46:02.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:02 np0005466013 nova_compute[192144]: 2025-10-02 12:46:02.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:46:02 np0005466013 nova_compute[192144]: 2025-10-02 12:46:02.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:46:03 np0005466013 nova_compute[192144]: 2025-10-02 12:46:03.009 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:46:03 np0005466013 nova_compute[192144]: 2025-10-02 12:46:03.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:05 np0005466013 podman[252012]: 2025-10-02 12:46:05.711293029 +0000 UTC m=+0.060208443 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  2 08:46:05 np0005466013 podman[252011]: 2025-10-02 12:46:05.733262759 +0000 UTC m=+0.090506295 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:46:05 np0005466013 podman[252013]: 2025-10-02 12:46:05.747741774 +0000 UTC m=+0.101193201 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:46:06 np0005466013 nova_compute[192144]: 2025-10-02 12:46:06.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:07 np0005466013 nova_compute[192144]: 2025-10-02 12:46:07.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:09 np0005466013 ovn_controller[94366]: 2025-10-02T12:46:09Z|00781|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  2 08:46:11 np0005466013 nova_compute[192144]: 2025-10-02 12:46:11.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:12 np0005466013 nova_compute[192144]: 2025-10-02 12:46:12.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:16 np0005466013 nova_compute[192144]: 2025-10-02 12:46:16.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:16 np0005466013 podman[252076]: 2025-10-02 12:46:16.678790264 +0000 UTC m=+0.052810650 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41)
Oct  2 08:46:16 np0005466013 podman[252077]: 2025-10-02 12:46:16.678785204 +0000 UTC m=+0.052160980 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:46:16 np0005466013 podman[252075]: 2025-10-02 12:46:16.678882267 +0000 UTC m=+0.055824465 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  2 08:46:17 np0005466013 nova_compute[192144]: 2025-10-02 12:46:17.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:19 np0005466013 podman[252135]: 2025-10-02 12:46:19.676794553 +0000 UTC m=+0.054548174 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:46:19 np0005466013 podman[252134]: 2025-10-02 12:46:19.689559265 +0000 UTC m=+0.071219779 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:46:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:46:20.450 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:46:20 np0005466013 nova_compute[192144]: 2025-10-02 12:46:20.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:20 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:46:20.451 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:46:21 np0005466013 nova_compute[192144]: 2025-10-02 12:46:21.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:22 np0005466013 nova_compute[192144]: 2025-10-02 12:46:22.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005466013 nova_compute[192144]: 2025-10-02 12:46:26.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:27 np0005466013 nova_compute[192144]: 2025-10-02 12:46:27.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:46:30.454 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:31 np0005466013 nova_compute[192144]: 2025-10-02 12:46:31.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:32 np0005466013 nova_compute[192144]: 2025-10-02 12:46:32.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:36 np0005466013 nova_compute[192144]: 2025-10-02 12:46:36.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:36 np0005466013 podman[252179]: 2025-10-02 12:46:36.681967438 +0000 UTC m=+0.051485799 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:46:36 np0005466013 podman[252178]: 2025-10-02 12:46:36.705744345 +0000 UTC m=+0.079435927 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:46:36 np0005466013 podman[252180]: 2025-10-02 12:46:36.721697736 +0000 UTC m=+0.088128700 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:46:37 np0005466013 nova_compute[192144]: 2025-10-02 12:46:37.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:41 np0005466013 nova_compute[192144]: 2025-10-02 12:46:41.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:42 np0005466013 nova_compute[192144]: 2025-10-02 12:46:42.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:46 np0005466013 nova_compute[192144]: 2025-10-02 12:46:46.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:47 np0005466013 nova_compute[192144]: 2025-10-02 12:46:47.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:47 np0005466013 podman[252249]: 2025-10-02 12:46:47.676041669 +0000 UTC m=+0.052941124 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Oct  2 08:46:47 np0005466013 podman[252248]: 2025-10-02 12:46:47.676588376 +0000 UTC m=+0.056033941 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:46:47 np0005466013 podman[252250]: 2025-10-02 12:46:47.684998441 +0000 UTC m=+0.057675384 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:46:48 np0005466013 nova_compute[192144]: 2025-10-02 12:46:48.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:48 np0005466013 nova_compute[192144]: 2025-10-02 12:46:48.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:46:49 np0005466013 nova_compute[192144]: 2025-10-02 12:46:49.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:50 np0005466013 podman[252308]: 2025-10-02 12:46:50.675829675 +0000 UTC m=+0.052717147 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:46:50 np0005466013 podman[252309]: 2025-10-02 12:46:50.708687567 +0000 UTC m=+0.080608173 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:46:50 np0005466013 nova_compute[192144]: 2025-10-02 12:46:50.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.014 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.015 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.015 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.015 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.172 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.174 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.13301086425781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.235 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.236 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.266 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.282 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.284 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:46:51 np0005466013 nova_compute[192144]: 2025-10-02 12:46:51.284 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:52 np0005466013 nova_compute[192144]: 2025-10-02 12:46:52.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:54 np0005466013 nova_compute[192144]: 2025-10-02 12:46:54.285 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:54 np0005466013 nova_compute[192144]: 2025-10-02 12:46:54.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:56 np0005466013 nova_compute[192144]: 2025-10-02 12:46:56.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:57 np0005466013 nova_compute[192144]: 2025-10-02 12:46:57.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:57 np0005466013 nova_compute[192144]: 2025-10-02 12:46:57.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:58 np0005466013 nova_compute[192144]: 2025-10-02 12:46:58.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:59 np0005466013 nova_compute[192144]: 2025-10-02 12:46:59.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:01 np0005466013 nova_compute[192144]: 2025-10-02 12:47:01.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:02.330 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:02.331 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:02.331 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:02 np0005466013 nova_compute[192144]: 2025-10-02 12:47:02.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:03 np0005466013 nova_compute[192144]: 2025-10-02 12:47:03.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:03 np0005466013 nova_compute[192144]: 2025-10-02 12:47:03.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:47:03 np0005466013 nova_compute[192144]: 2025-10-02 12:47:03.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:47:04 np0005466013 nova_compute[192144]: 2025-10-02 12:47:04.012 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:47:04 np0005466013 nova_compute[192144]: 2025-10-02 12:47:04.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:06 np0005466013 nova_compute[192144]: 2025-10-02 12:47:06.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:07 np0005466013 nova_compute[192144]: 2025-10-02 12:47:07.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:07 np0005466013 podman[252352]: 2025-10-02 12:47:07.671717247 +0000 UTC m=+0.048248427 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:47:07 np0005466013 podman[252354]: 2025-10-02 12:47:07.712650193 +0000 UTC m=+0.079093526 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:47:07 np0005466013 podman[252353]: 2025-10-02 12:47:07.730955719 +0000 UTC m=+0.090939579 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:47:11 np0005466013 nova_compute[192144]: 2025-10-02 12:47:11.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:12 np0005466013 nova_compute[192144]: 2025-10-02 12:47:12.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:16 np0005466013 nova_compute[192144]: 2025-10-02 12:47:16.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:47:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:17 np0005466013 nova_compute[192144]: 2025-10-02 12:47:17.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:18 np0005466013 podman[252418]: 2025-10-02 12:47:18.685577938 +0000 UTC m=+0.061744411 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:47:18 np0005466013 podman[252420]: 2025-10-02 12:47:18.698244066 +0000 UTC m=+0.063767424 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm)
Oct  2 08:47:18 np0005466013 podman[252419]: 2025-10-02 12:47:18.701979074 +0000 UTC m=+0.070032392 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:47:21 np0005466013 nova_compute[192144]: 2025-10-02 12:47:21.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:21 np0005466013 podman[252477]: 2025-10-02 12:47:21.678079454 +0000 UTC m=+0.054642877 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:47:21 np0005466013 podman[252478]: 2025-10-02 12:47:21.692201928 +0000 UTC m=+0.063348281 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:47:21 np0005466013 nova_compute[192144]: 2025-10-02 12:47:21.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:21.747 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:21.748 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:47:22 np0005466013 nova_compute[192144]: 2025-10-02 12:47:22.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:26 np0005466013 nova_compute[192144]: 2025-10-02 12:47:26.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:27 np0005466013 nova_compute[192144]: 2025-10-02 12:47:27.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:30 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:30.751 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:31 np0005466013 nova_compute[192144]: 2025-10-02 12:47:31.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:32 np0005466013 nova_compute[192144]: 2025-10-02 12:47:32.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:36 np0005466013 nova_compute[192144]: 2025-10-02 12:47:36.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:37 np0005466013 nova_compute[192144]: 2025-10-02 12:47:37.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:38 np0005466013 podman[252521]: 2025-10-02 12:47:38.671057624 +0000 UTC m=+0.048024720 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:47:38 np0005466013 podman[252522]: 2025-10-02 12:47:38.671147257 +0000 UTC m=+0.044908551 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:47:38 np0005466013 podman[252523]: 2025-10-02 12:47:38.698807526 +0000 UTC m=+0.069020799 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller)
Oct  2 08:47:41 np0005466013 nova_compute[192144]: 2025-10-02 12:47:41.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:42 np0005466013 nova_compute[192144]: 2025-10-02 12:47:42.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:46 np0005466013 nova_compute[192144]: 2025-10-02 12:47:46.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:47 np0005466013 nova_compute[192144]: 2025-10-02 12:47:47.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:49 np0005466013 podman[252588]: 2025-10-02 12:47:49.679918002 +0000 UTC m=+0.060149172 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct  2 08:47:49 np0005466013 podman[252589]: 2025-10-02 12:47:49.690918616 +0000 UTC m=+0.064631561 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container)
Oct  2 08:47:49 np0005466013 podman[252590]: 2025-10-02 12:47:49.719218666 +0000 UTC m=+0.093675044 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:47:49 np0005466013 nova_compute[192144]: 2025-10-02 12:47:49.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:49 np0005466013 nova_compute[192144]: 2025-10-02 12:47:49.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:47:50 np0005466013 nova_compute[192144]: 2025-10-02 12:47:50.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:50 np0005466013 nova_compute[192144]: 2025-10-02 12:47:50.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.028 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.029 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.029 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.191 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.192 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5725MB free_disk=73.13240051269531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.192 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.193 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.285 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.286 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.308 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.324 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.325 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:47:51 np0005466013 nova_compute[192144]: 2025-10-02 12:47:51.325 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:52 np0005466013 nova_compute[192144]: 2025-10-02 12:47:52.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:52 np0005466013 podman[252647]: 2025-10-02 12:47:52.714760456 +0000 UTC m=+0.090118672 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:47:52 np0005466013 podman[252648]: 2025-10-02 12:47:52.721045154 +0000 UTC m=+0.096563135 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, tcib_managed=true)
Oct  2 08:47:54 np0005466013 nova_compute[192144]: 2025-10-02 12:47:54.325 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:54 np0005466013 nova_compute[192144]: 2025-10-02 12:47:54.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:56 np0005466013 nova_compute[192144]: 2025-10-02 12:47:56.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:57 np0005466013 nova_compute[192144]: 2025-10-02 12:47:57.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:58 np0005466013 nova_compute[192144]: 2025-10-02 12:47:58.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:59.679 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:59 np0005466013 nova_compute[192144]: 2025-10-02 12:47:59.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:47:59.680 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:48:01 np0005466013 nova_compute[192144]: 2025-10-02 12:48:01.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:01 np0005466013 nova_compute[192144]: 2025-10-02 12:48:01.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:02.332 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:02.332 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:02.332 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:02 np0005466013 nova_compute[192144]: 2025-10-02 12:48:02.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:04 np0005466013 nova_compute[192144]: 2025-10-02 12:48:04.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:05 np0005466013 nova_compute[192144]: 2025-10-02 12:48:05.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:05 np0005466013 nova_compute[192144]: 2025-10-02 12:48:05.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:48:05 np0005466013 nova_compute[192144]: 2025-10-02 12:48:05.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:48:06 np0005466013 nova_compute[192144]: 2025-10-02 12:48:06.014 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:48:06 np0005466013 nova_compute[192144]: 2025-10-02 12:48:06.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:07 np0005466013 nova_compute[192144]: 2025-10-02 12:48:07.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:08 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:08.681 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:09 np0005466013 podman[252692]: 2025-10-02 12:48:09.688911386 +0000 UTC m=+0.060415249 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:48:09 np0005466013 podman[252691]: 2025-10-02 12:48:09.708445849 +0000 UTC m=+0.086539069 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:48:09 np0005466013 podman[252693]: 2025-10-02 12:48:09.720895221 +0000 UTC m=+0.093936742 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:48:11 np0005466013 nova_compute[192144]: 2025-10-02 12:48:11.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:12 np0005466013 nova_compute[192144]: 2025-10-02 12:48:12.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:16 np0005466013 nova_compute[192144]: 2025-10-02 12:48:16.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.214 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.215 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.236 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.367 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.368 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.375 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.376 2 INFO nova.compute.claims [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.485 2 DEBUG nova.compute.provider_tree [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.499 2 DEBUG nova.scheduler.client.report [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.532 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.533 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.580 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.580 2 DEBUG nova.network.neutron [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.596 2 INFO nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.622 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.747 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.748 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.749 2 INFO nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Creating image(s)#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.749 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "/var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.749 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.750 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.763 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.822 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.823 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.824 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.835 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.894 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.895 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.925 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.926 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.926 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.978 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.979 2 DEBUG nova.virt.disk.api [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Checking if we can resize image /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:48:17 np0005466013 nova_compute[192144]: 2025-10-02 12:48:17.980 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.030 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.031 2 DEBUG nova.virt.disk.api [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Cannot resize image /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.032 2 DEBUG nova.objects.instance [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'migration_context' on Instance uuid c7dd2042-e2a5-4491-aa1c-0e72597641e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.046 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.046 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Ensure instance console log exists: /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.047 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.047 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.047 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:18 np0005466013 nova_compute[192144]: 2025-10-02 12:48:18.679 2 DEBUG nova.policy [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:48:20 np0005466013 podman[252773]: 2025-10-02 12:48:20.686629701 +0000 UTC m=+0.056077623 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Oct  2 08:48:20 np0005466013 podman[252774]: 2025-10-02 12:48:20.69167167 +0000 UTC m=+0.058859251 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:48:20 np0005466013 podman[252772]: 2025-10-02 12:48:20.69167405 +0000 UTC m=+0.061628137 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:48:21 np0005466013 nova_compute[192144]: 2025-10-02 12:48:21.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:22 np0005466013 nova_compute[192144]: 2025-10-02 12:48:22.503 2 DEBUG nova.network.neutron [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Successfully created port: 84cebaa0-8158-4467-a214-70216aa0fa77 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:48:22 np0005466013 nova_compute[192144]: 2025-10-02 12:48:22.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:23 np0005466013 podman[252826]: 2025-10-02 12:48:23.671445337 +0000 UTC m=+0.044798899 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:48:23 np0005466013 podman[252827]: 2025-10-02 12:48:23.688574395 +0000 UTC m=+0.054622457 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.051 2 DEBUG nova.network.neutron [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Successfully updated port: 84cebaa0-8158-4467-a214-70216aa0fa77 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.099 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.099 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquired lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.099 2 DEBUG nova.network.neutron [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.273 2 DEBUG nova.compute.manager [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-changed-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.274 2 DEBUG nova.compute.manager [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Refreshing instance network info cache due to event network-changed-84cebaa0-8158-4467-a214-70216aa0fa77. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.274 2 DEBUG oslo_concurrency.lockutils [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:48:24 np0005466013 nova_compute[192144]: 2025-10-02 12:48:24.675 2 DEBUG nova.network.neutron [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.121 2 DEBUG nova.network.neutron [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updating instance_info_cache with network_info: [{"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.143 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Releasing lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.143 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Instance network_info: |[{"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.143 2 DEBUG oslo_concurrency.lockutils [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.144 2 DEBUG nova.network.neutron [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Refreshing network info cache for port 84cebaa0-8158-4467-a214-70216aa0fa77 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.146 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Start _get_guest_xml network_info=[{"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.151 2 WARNING nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.164 2 DEBUG nova.virt.libvirt.host [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.165 2 DEBUG nova.virt.libvirt.host [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.170 2 DEBUG nova.virt.libvirt.host [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.171 2 DEBUG nova.virt.libvirt.host [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.172 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.172 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.172 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.173 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.173 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.173 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.173 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.174 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.174 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.174 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.174 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.175 2 DEBUG nova.virt.hardware [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.179 2 DEBUG nova.virt.libvirt.vif [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:48:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=184,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI2KKxYqJuo+bm0uXO0va+WiltctIuUrNVSuyXKH60Q282vpKz7lkIUwo7YbhQgvFPQ6W6pvlS1MgI71IgsIlYiUsaPlzFVJnshPK84X/j2YUTiXwv4g5W08cDEUTRF7vw==',key_name='tempest-TestSecurityGroupsBasicOps-1976615750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-dv489toa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:48:17Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=c7dd2042-e2a5-4491-aa1c-0e72597641e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.180 2 DEBUG nova.network.os_vif_util [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.181 2 DEBUG nova.network.os_vif_util [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:55,bridge_name='br-int',has_traffic_filtering=True,id=84cebaa0-8158-4467-a214-70216aa0fa77,network=Network(3c776fa4-63c0-44fa-bf3f-04ad74974c2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84cebaa0-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.182 2 DEBUG nova.objects.instance [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'pci_devices' on Instance uuid c7dd2042-e2a5-4491-aa1c-0e72597641e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.199 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <uuid>c7dd2042-e2a5-4491-aa1c-0e72597641e0</uuid>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <name>instance-000000b8</name>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623</nova:name>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:48:26</nova:creationTime>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:user uuid="2d2b4a2da57543ef88e44ae28ad61647">tempest-TestSecurityGroupsBasicOps-1020134341-project-member</nova:user>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:project uuid="575f3d227ab24f2daa62e65e14a4cd9c">tempest-TestSecurityGroupsBasicOps-1020134341</nova:project>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        <nova:port uuid="84cebaa0-8158-4467-a214-70216aa0fa77">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <entry name="serial">c7dd2042-e2a5-4491-aa1c-0e72597641e0</entry>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <entry name="uuid">c7dd2042-e2a5-4491-aa1c-0e72597641e0</entry>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.config"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:95:8b:55"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <target dev="tap84cebaa0-81"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/console.log" append="off"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:48:26 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:48:26 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:48:26 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:48:26 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.200 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Preparing to wait for external event network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.201 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.201 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.201 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.202 2 DEBUG nova.virt.libvirt.vif [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:48:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=184,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI2KKxYqJuo+bm0uXO0va+WiltctIuUrNVSuyXKH60Q282vpKz7lkIUwo7YbhQgvFPQ6W6pvlS1MgI71IgsIlYiUsaPlzFVJnshPK84X/j2YUTiXwv4g5W08cDEUTRF7vw==',key_name='tempest-TestSecurityGroupsBasicOps-1976615750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-dv489toa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:48:17Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=c7dd2042-e2a5-4491-aa1c-0e72597641e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.202 2 DEBUG nova.network.os_vif_util [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.203 2 DEBUG nova.network.os_vif_util [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:55,bridge_name='br-int',has_traffic_filtering=True,id=84cebaa0-8158-4467-a214-70216aa0fa77,network=Network(3c776fa4-63c0-44fa-bf3f-04ad74974c2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84cebaa0-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.203 2 DEBUG os_vif [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:55,bridge_name='br-int',has_traffic_filtering=True,id=84cebaa0-8158-4467-a214-70216aa0fa77,network=Network(3c776fa4-63c0-44fa-bf3f-04ad74974c2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84cebaa0-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.204 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.205 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.207 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84cebaa0-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.208 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap84cebaa0-81, col_values=(('external_ids', {'iface-id': '84cebaa0-8158-4467-a214-70216aa0fa77', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:8b:55', 'vm-uuid': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:26 np0005466013 NetworkManager[51205]: <info>  [1759409306.2105] manager: (tap84cebaa0-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.219 2 INFO os_vif [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:55,bridge_name='br-int',has_traffic_filtering=True,id=84cebaa0-8158-4467-a214-70216aa0fa77,network=Network(3c776fa4-63c0-44fa-bf3f-04ad74974c2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84cebaa0-81')#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.290 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.291 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.291 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No VIF found with MAC fa:16:3e:95:8b:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.292 2 INFO nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Using config drive#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.893 2 INFO nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Creating config drive at /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.config#033[00m
Oct  2 08:48:26 np0005466013 nova_compute[192144]: 2025-10-02 12:48:26.898 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_fx8ilto execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.025 2 DEBUG oslo_concurrency.processutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_fx8ilto" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:48:27 np0005466013 kernel: tap84cebaa0-81: entered promiscuous mode
Oct  2 08:48:27 np0005466013 NetworkManager[51205]: <info>  [1759409307.0773] manager: (tap84cebaa0-81): new Tun device (/org/freedesktop/NetworkManager/Devices/348)
Oct  2 08:48:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:27Z|00782|binding|INFO|Claiming lport 84cebaa0-8158-4467-a214-70216aa0fa77 for this chassis.
Oct  2 08:48:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:27Z|00783|binding|INFO|84cebaa0-8158-4467-a214-70216aa0fa77: Claiming fa:16:3e:95:8b:55 10.100.0.6
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.092 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:8b:55 10.100.0.6'], port_security=['fa:16:3e:95:8b:55 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3bf5c068-41c3-45ca-8822-72717311e7da d068f527-d669-40d0-ac19-cae69897b62d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78978e88-15ad-4f25-bc19-feb08335ac33, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=84cebaa0-8158-4467-a214-70216aa0fa77) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.093 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 84cebaa0-8158-4467-a214-70216aa0fa77 in datapath 3c776fa4-63c0-44fa-bf3f-04ad74974c2c bound to our chassis#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.094 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c776fa4-63c0-44fa-bf3f-04ad74974c2c#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.107 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[58bb8d8a-f9a8-47e7-bc67-f3ad6719b942]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.108 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c776fa4-61 in ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:48:27 np0005466013 systemd-udevd[252891]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.110 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c776fa4-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.110 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[aaec95eb-f5f6-4781-bcd4-870649c31a06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.112 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f7bc5455-c8af-4cd4-8b07-c23e1673ff70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 systemd-machined[152202]: New machine qemu-82-instance-000000b8.
Oct  2 08:48:27 np0005466013 NetworkManager[51205]: <info>  [1759409307.1217] device (tap84cebaa0-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:48:27 np0005466013 NetworkManager[51205]: <info>  [1759409307.1229] device (tap84cebaa0-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.123 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[1a895fc3-d9fa-49c2-9598-6820ab662c1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 systemd[1]: Started Virtual Machine qemu-82-instance-000000b8.
Oct  2 08:48:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:27Z|00784|binding|INFO|Setting lport 84cebaa0-8158-4467-a214-70216aa0fa77 ovn-installed in OVS
Oct  2 08:48:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:27Z|00785|binding|INFO|Setting lport 84cebaa0-8158-4467-a214-70216aa0fa77 up in Southbound
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.150 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[453127c6-cad2-4642-8ddc-3ad6b52b7fa6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.177 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[852f0ede-42c3-4aa2-8c6b-2376d94ecd1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.182 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d06f8f-fa7b-4902-bc45-44f8f83804a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 NetworkManager[51205]: <info>  [1759409307.1832] manager: (tap3c776fa4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/349)
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.210 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[7592f399-8ab4-461d-8cab-a597ba6249a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.213 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[e79382eb-6730-4c6d-b0be-53585d40dacc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 NetworkManager[51205]: <info>  [1759409307.2337] device (tap3c776fa4-60): carrier: link connected
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.238 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[37e65de8-3c49-4da0-8218-f988ccb0be8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.252 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[784a2aff-a85b-4722-93b5-489131a0be8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c776fa4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:93:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 729885, 'reachable_time': 43426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252923, 'error': None, 'target': 'ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.264 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0cac20-43cb-401f-97f4-f1747add5e97]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec5:9334'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 729885, 'tstamp': 729885}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252924, 'error': None, 'target': 'ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.284 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[e62881c3-57c1-4184-9ae2-dfda1104edc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c776fa4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:93:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 729885, 'reachable_time': 43426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252925, 'error': None, 'target': 'ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.308 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[5801df3a-078a-4735-bad9-a6f7c054926a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.357 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[42698376-f8a1-4032-998a-e6661e654788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.358 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c776fa4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.359 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.359 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c776fa4-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 kernel: tap3c776fa4-60: entered promiscuous mode
Oct  2 08:48:27 np0005466013 NetworkManager[51205]: <info>  [1759409307.3619] manager: (tap3c776fa4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.368 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c776fa4-60, col_values=(('external_ids', {'iface-id': 'b354dca0-bf82-4ac8-ba2d-7afd74e436fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:27Z|00786|binding|INFO|Releasing lport b354dca0-bf82-4ac8-ba2d-7afd74e436fa from this chassis (sb_readonly=0)
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.372 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c776fa4-63c0-44fa-bf3f-04ad74974c2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c776fa4-63c0-44fa-bf3f-04ad74974c2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.373 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbe6d48-267f-4e61-887d-e805131576c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.373 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-3c776fa4-63c0-44fa-bf3f-04ad74974c2c
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/3c776fa4-63c0-44fa-bf3f-04ad74974c2c.pid.haproxy
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID 3c776fa4-63c0-44fa-bf3f-04ad74974c2c
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:48:27 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:48:27.374 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'env', 'PROCESS_TAG=haproxy-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c776fa4-63c0-44fa-bf3f-04ad74974c2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.662 2 DEBUG nova.compute.manager [req-590e61bd-2a00-4f67-8c0d-204d7733f2f1 req-ea03245f-a1df-4cba-8fc6-40e04d18cd8e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.662 2 DEBUG oslo_concurrency.lockutils [req-590e61bd-2a00-4f67-8c0d-204d7733f2f1 req-ea03245f-a1df-4cba-8fc6-40e04d18cd8e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.663 2 DEBUG oslo_concurrency.lockutils [req-590e61bd-2a00-4f67-8c0d-204d7733f2f1 req-ea03245f-a1df-4cba-8fc6-40e04d18cd8e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.663 2 DEBUG oslo_concurrency.lockutils [req-590e61bd-2a00-4f67-8c0d-204d7733f2f1 req-ea03245f-a1df-4cba-8fc6-40e04d18cd8e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:27 np0005466013 nova_compute[192144]: 2025-10-02 12:48:27.663 2 DEBUG nova.compute.manager [req-590e61bd-2a00-4f67-8c0d-204d7733f2f1 req-ea03245f-a1df-4cba-8fc6-40e04d18cd8e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Processing event network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:48:27 np0005466013 podman[252957]: 2025-10-02 12:48:27.710395178 +0000 UTC m=+0.047065300 container create e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:48:27 np0005466013 systemd[1]: Started libpod-conmon-e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945.scope.
Oct  2 08:48:27 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:48:27 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5ece0441b1a7a5b58653ecb9ea88d47d6d0284f7c461d4b695439d43caf69a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:48:27 np0005466013 podman[252957]: 2025-10-02 12:48:27.684753962 +0000 UTC m=+0.021424104 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:48:27 np0005466013 podman[252957]: 2025-10-02 12:48:27.792628532 +0000 UTC m=+0.129298684 container init e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:48:27 np0005466013 podman[252957]: 2025-10-02 12:48:27.798631411 +0000 UTC m=+0.135301533 container start e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:48:27 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [NOTICE]   (252976) : New worker (252978) forked
Oct  2 08:48:27 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [NOTICE]   (252976) : Loading success.
Oct  2 08:48:28 np0005466013 nova_compute[192144]: 2025-10-02 12:48:28.060 2 DEBUG nova.network.neutron [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updated VIF entry in instance network info cache for port 84cebaa0-8158-4467-a214-70216aa0fa77. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:48:28 np0005466013 nova_compute[192144]: 2025-10-02 12:48:28.061 2 DEBUG nova.network.neutron [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updating instance_info_cache with network_info: [{"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:48:28 np0005466013 nova_compute[192144]: 2025-10-02 12:48:28.264 2 DEBUG oslo_concurrency.lockutils [req-40775e32-3af1-4809-a4cc-645362bf10a4 req-1ffdf540-a84d-486d-862c-8d63577b4626 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.801 2 DEBUG nova.compute.manager [req-60ad0bfb-7af2-4b67-88a1-91ce816bbce5 req-7508363c-47b3-4f11-9f61-5c08fb846d2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.803 2 DEBUG oslo_concurrency.lockutils [req-60ad0bfb-7af2-4b67-88a1-91ce816bbce5 req-7508363c-47b3-4f11-9f61-5c08fb846d2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.803 2 DEBUG oslo_concurrency.lockutils [req-60ad0bfb-7af2-4b67-88a1-91ce816bbce5 req-7508363c-47b3-4f11-9f61-5c08fb846d2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.804 2 DEBUG oslo_concurrency.lockutils [req-60ad0bfb-7af2-4b67-88a1-91ce816bbce5 req-7508363c-47b3-4f11-9f61-5c08fb846d2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.804 2 DEBUG nova.compute.manager [req-60ad0bfb-7af2-4b67-88a1-91ce816bbce5 req-7508363c-47b3-4f11-9f61-5c08fb846d2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] No waiting events found dispatching network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.804 2 WARNING nova.compute.manager [req-60ad0bfb-7af2-4b67-88a1-91ce816bbce5 req-7508363c-47b3-4f11-9f61-5c08fb846d2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received unexpected event network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.813 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.814 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409309.8128028, c7dd2042-e2a5-4491-aa1c-0e72597641e0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.814 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] VM Started (Lifecycle Event)#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.816 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.820 2 INFO nova.virt.libvirt.driver [-] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Instance spawned successfully.#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.820 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.841 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.846 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.855 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.855 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.856 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.856 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.856 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.857 2 DEBUG nova.virt.libvirt.driver [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.880 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.881 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409309.8135767, c7dd2042-e2a5-4491-aa1c-0e72597641e0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.881 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.918 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.922 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409309.8157902, c7dd2042-e2a5-4491-aa1c-0e72597641e0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.923 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.944 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.946 2 INFO nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Took 12.20 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.946 2 DEBUG nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.950 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:48:29 np0005466013 nova_compute[192144]: 2025-10-02 12:48:29.979 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:48:30 np0005466013 nova_compute[192144]: 2025-10-02 12:48:30.060 2 INFO nova.compute.manager [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Took 12.75 seconds to build instance.#033[00m
Oct  2 08:48:30 np0005466013 nova_compute[192144]: 2025-10-02 12:48:30.091 2 DEBUG oslo_concurrency.lockutils [None req-81b53b21-b7e8-4856-8edc-cd68c5f9ce6c 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.876s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:31 np0005466013 nova_compute[192144]: 2025-10-02 12:48:31.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:31 np0005466013 nova_compute[192144]: 2025-10-02 12:48:31.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:36 np0005466013 nova_compute[192144]: 2025-10-02 12:48:36.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:36 np0005466013 nova_compute[192144]: 2025-10-02 12:48:36.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:39 np0005466013 NetworkManager[51205]: <info>  [1759409319.8073] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Oct  2 08:48:39 np0005466013 NetworkManager[51205]: <info>  [1759409319.8083] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Oct  2 08:48:39 np0005466013 nova_compute[192144]: 2025-10-02 12:48:39.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:39 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:39Z|00787|binding|INFO|Releasing lport b354dca0-bf82-4ac8-ba2d-7afd74e436fa from this chassis (sb_readonly=0)
Oct  2 08:48:39 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:39Z|00788|binding|INFO|Releasing lport b354dca0-bf82-4ac8-ba2d-7afd74e436fa from this chassis (sb_readonly=0)
Oct  2 08:48:39 np0005466013 nova_compute[192144]: 2025-10-02 12:48:39.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:39 np0005466013 nova_compute[192144]: 2025-10-02 12:48:39.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:40 np0005466013 nova_compute[192144]: 2025-10-02 12:48:40.545 2 DEBUG nova.compute.manager [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-changed-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:48:40 np0005466013 nova_compute[192144]: 2025-10-02 12:48:40.546 2 DEBUG nova.compute.manager [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Refreshing instance network info cache due to event network-changed-84cebaa0-8158-4467-a214-70216aa0fa77. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:48:40 np0005466013 nova_compute[192144]: 2025-10-02 12:48:40.546 2 DEBUG oslo_concurrency.lockutils [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:48:40 np0005466013 nova_compute[192144]: 2025-10-02 12:48:40.546 2 DEBUG oslo_concurrency.lockutils [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:48:40 np0005466013 nova_compute[192144]: 2025-10-02 12:48:40.546 2 DEBUG nova.network.neutron [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Refreshing network info cache for port 84cebaa0-8158-4467-a214-70216aa0fa77 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:48:40 np0005466013 podman[252997]: 2025-10-02 12:48:40.677880121 +0000 UTC m=+0.052052317 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:48:40 np0005466013 podman[252996]: 2025-10-02 12:48:40.714863093 +0000 UTC m=+0.086345444 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:48:40 np0005466013 podman[252998]: 2025-10-02 12:48:40.738792325 +0000 UTC m=+0.108792560 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:48:41 np0005466013 nova_compute[192144]: 2025-10-02 12:48:41.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:41 np0005466013 nova_compute[192144]: 2025-10-02 12:48:41.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:43 np0005466013 nova_compute[192144]: 2025-10-02 12:48:43.885 2 DEBUG nova.network.neutron [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updated VIF entry in instance network info cache for port 84cebaa0-8158-4467-a214-70216aa0fa77. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:48:43 np0005466013 nova_compute[192144]: 2025-10-02 12:48:43.886 2 DEBUG nova.network.neutron [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updating instance_info_cache with network_info: [{"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:48:43 np0005466013 nova_compute[192144]: 2025-10-02 12:48:43.913 2 DEBUG oslo_concurrency.lockutils [req-bf2edd42-3d1b-41e0-86c9-95f8c18845a6 req-94dcd3e3-e5aa-462a-9f86-b7330d3e6314 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:48:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:45Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:8b:55 10.100.0.6
Oct  2 08:48:45 np0005466013 ovn_controller[94366]: 2025-10-02T12:48:45Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:8b:55 10.100.0.6
Oct  2 08:48:46 np0005466013 nova_compute[192144]: 2025-10-02 12:48:46.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:46 np0005466013 nova_compute[192144]: 2025-10-02 12:48:46.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:50 np0005466013 nova_compute[192144]: 2025-10-02 12:48:50.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:50 np0005466013 nova_compute[192144]: 2025-10-02 12:48:50.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:48:51 np0005466013 nova_compute[192144]: 2025-10-02 12:48:51.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:51 np0005466013 nova_compute[192144]: 2025-10-02 12:48:51.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:48:51 np0005466013 podman[253084]: 2025-10-02 12:48:51.701772359 +0000 UTC m=+0.068338439 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:48:51 np0005466013 podman[253083]: 2025-10-02 12:48:51.714819879 +0000 UTC m=+0.077899739 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:48:51 np0005466013 podman[253085]: 2025-10-02 12:48:51.727163577 +0000 UTC m=+0.083860716 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:48:51 np0005466013 nova_compute[192144]: 2025-10-02 12:48:51.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:52 np0005466013 nova_compute[192144]: 2025-10-02 12:48:52.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.199 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.200 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.200 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.200 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.517 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.598 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.599 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.693 2 DEBUG oslo_concurrency.processutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.839 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.840 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5519MB free_disk=73.1031265258789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.840 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:48:53 np0005466013 nova_compute[192144]: 2025-10-02 12:48:53.840 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:48:54 np0005466013 nova_compute[192144]: 2025-10-02 12:48:54.040 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Instance c7dd2042-e2a5-4491-aa1c-0e72597641e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  2 08:48:54 np0005466013 nova_compute[192144]: 2025-10-02 12:48:54.040 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:48:54 np0005466013 nova_compute[192144]: 2025-10-02 12:48:54.040 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:48:54 np0005466013 nova_compute[192144]: 2025-10-02 12:48:54.096 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:48:54 np0005466013 nova_compute[192144]: 2025-10-02 12:48:54.209 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:48:54 np0005466013 nova_compute[192144]: 2025-10-02 12:48:54.337 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:48:54 np0005466013 nova_compute[192144]: 2025-10-02 12:48:54.337 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:48:54 np0005466013 podman[253148]: 2025-10-02 12:48:54.678028745 +0000 UTC m=+0.049302070 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:48:54 np0005466013 podman[253149]: 2025-10-02 12:48:54.678026805 +0000 UTC m=+0.046708758 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:48:55 np0005466013 nova_compute[192144]: 2025-10-02 12:48:55.338 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:56 np0005466013 nova_compute[192144]: 2025-10-02 12:48:56.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:56 np0005466013 nova_compute[192144]: 2025-10-02 12:48:56.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:56 np0005466013 nova_compute[192144]: 2025-10-02 12:48:56.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:58 np0005466013 nova_compute[192144]: 2025-10-02 12:48:58.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:01 np0005466013 nova_compute[192144]: 2025-10-02 12:49:01.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005466013 nova_compute[192144]: 2025-10-02 12:49:01.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:01 np0005466013 nova_compute[192144]: 2025-10-02 12:49:01.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:02.332 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:02.333 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:02.333 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:04 np0005466013 nova_compute[192144]: 2025-10-02 12:49:04.716 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:04 np0005466013 nova_compute[192144]: 2025-10-02 12:49:04.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:05.520 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:05 np0005466013 nova_compute[192144]: 2025-10-02 12:49:05.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:05 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:05.521 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:49:05 np0005466013 nova_compute[192144]: 2025-10-02 12:49:05.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:05 np0005466013 nova_compute[192144]: 2025-10-02 12:49:05.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:49:05 np0005466013 nova_compute[192144]: 2025-10-02 12:49:05.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:49:06 np0005466013 nova_compute[192144]: 2025-10-02 12:49:06.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:06 np0005466013 nova_compute[192144]: 2025-10-02 12:49:06.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:06 np0005466013 nova_compute[192144]: 2025-10-02 12:49:06.668 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:49:06 np0005466013 nova_compute[192144]: 2025-10-02 12:49:06.669 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquired lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:49:06 np0005466013 nova_compute[192144]: 2025-10-02 12:49:06.669 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:49:06 np0005466013 nova_compute[192144]: 2025-10-02 12:49:06.669 2 DEBUG nova.objects.instance [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lazy-loading 'info_cache' on Instance uuid c7dd2042-e2a5-4491-aa1c-0e72597641e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:09 np0005466013 nova_compute[192144]: 2025-10-02 12:49:09.432 2 DEBUG nova.network.neutron [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updating instance_info_cache with network_info: [{"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:09 np0005466013 nova_compute[192144]: 2025-10-02 12:49:09.451 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Releasing lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:09 np0005466013 nova_compute[192144]: 2025-10-02 12:49:09.451 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:49:11 np0005466013 nova_compute[192144]: 2025-10-02 12:49:11.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:11 np0005466013 nova_compute[192144]: 2025-10-02 12:49:11.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:11 np0005466013 podman[253193]: 2025-10-02 12:49:11.67681756 +0000 UTC m=+0.048695852 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:49:11 np0005466013 podman[253194]: 2025-10-02 12:49:11.682339093 +0000 UTC m=+0.052027996 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:49:11 np0005466013 podman[253195]: 2025-10-02 12:49:11.744107324 +0000 UTC m=+0.110411401 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:49:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:14.523 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:16 np0005466013 nova_compute[192144]: 2025-10-02 12:49:16.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:16 np0005466013 nova_compute[192144]: 2025-10-02 12:49:16.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.363 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b8', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'hostId': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.364 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.367 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c7dd2042-e2a5-4491-aa1c-0e72597641e0 / tap84cebaa0-81 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.367 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3522c2c2-d7fb-4a3b-af76-12d33f39468d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.364771', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '34835118-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': '09237a535e5f9adfde8736275b2343d4b9acb1b4d2e32dd8bee0e5824fed299f'}]}, 'timestamp': '2025-10-02 12:49:16.368107', '_unique_id': '2f19bc9bd5d446608624a7a4bb77daa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.369 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.371 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eef64669-d99d-433d-b3c7-836de1d84a3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.371451', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '3483e40c-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': 'f0e3e901f5c6a8accb70e23f17a508e15bca3a2c159d5baeb7b5cb564932846d'}]}, 'timestamp': '2025-10-02 12:49:16.371880', '_unique_id': '5345e3baf9804cf18681c44448a2e7a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.372 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.373 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.394 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.394 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f12e030-4043-4177-81c4-324b2cb59247', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.373805', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34876726-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': '26a5ac88a8501c907890d58a7c187c38d7b20a38eef93679cf2d41642253ddf1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.373805', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '348777fc-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': '0e5be2ace93ae5fbe8c2c663716179367937324e370233fbc53dccbdc05ad89d'}]}, 'timestamp': '2025-10-02 12:49:16.395239', '_unique_id': 'd4d930d9f554445b980c7280e591a2fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.396 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.397 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.outgoing.bytes volume: 9788 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac022a0e-57bd-4ff2-9ec0-87647bf137fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9788, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.397647', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '3487e07a-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': 'd0d49df0be443d460549cbba95cbc9e60747b1650b40e3b286d09871cb01b4c6'}]}, 'timestamp': '2025-10-02 12:49:16.397915', '_unique_id': '5da233f8ddc848829090874580d979ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.398 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.399 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.399 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>]
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.399 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.399 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.read.latency volume: 1155500005 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.399 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.read.latency volume: 41999061 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6641464f-0f40-4d1e-aa25-29160d4d413e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1155500005, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.399449', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34882652-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': 'c464a09ddf59be8795b1aa0e630ea93f7280631a4b66ddfa46b244747207c78b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41999061, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.399449', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34882e2c-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': 'b87691b37c77d482cc34dfbff924fdebec5b6b534c3831f37fcc8ee8516fe32d'}]}, 'timestamp': '2025-10-02 12:49:16.399884', '_unique_id': 'a50054674ad04a02861216f5ec1c0893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.400 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.read.bytes volume: 31070720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf463594-2c49-40e6-9ca2-6b7273d8da58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31070720, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.400997', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3488627a-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': 'be37a6bdcff86f298eadc4ec58ad45b1b953b105480936aa87669c5be74ac672'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.400997', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34886a7c-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': '938947f63c1fc7be358eddd99c11a947ba2d0eae296d44f6e8e1ded898b16d0e'}]}, 'timestamp': '2025-10-02 12:49:16.401411', '_unique_id': '892a2a5e35d94aef8c86b4079d5a01bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.401 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.414 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.414 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63c5efac-6469-4f0c-a7b9-030a691def4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.402489', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '348a7650-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.080737652, 'message_signature': 'a4a8bb4b02c813cd55415d5f0d82495b33c61a1665c962ae3e061bf3188d2ff0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.402489', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '348a84ec-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.080737652, 'message_signature': '0c820b1949b61feaba5f5f89fb6d9edb05cc93a1e2763b6d261669796c7b5e5f'}]}, 'timestamp': '2025-10-02 12:49:16.415251', '_unique_id': '6a0124a21e4e41b3a933a8005a4cb7fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.416 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.417 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.417 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.417 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>]
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.417 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.417 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.incoming.packets volume: 67 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b18d3a29-2249-493c-ae8e-c63fb46dd77b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 67, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.417495', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '348ae86a-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': '5e9bfa799882837aebcf025a40eb0212c2e5c88423441d700437d07bd2d5d098'}]}, 'timestamp': '2025-10-02 12:49:16.417757', '_unique_id': 'd8cff91f8420410cba5340c234475357'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.418 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.419 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.419 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.419 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60cf4177-b219-4d73-8422-0853ac6a294c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.419134', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '348b2852-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.080737652, 'message_signature': '408c5f48e23a472fa3d0c6efc9617bb4c1c7513f77ccf2b09f18a369819f0dca'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.419134', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '348b32a2-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.080737652, 'message_signature': '209cc471e229ab202b9d23a09b287a144adfbb6d87af248f87604d72e3238a02'}]}, 'timestamp': '2025-10-02 12:49:16.419644', '_unique_id': '594ebd8fc4e44241b713cba57853b2f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.420 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>]
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd64c213f-25d6-4eb4-b504-7dd7d892841f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.421061', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '348b729e-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': '380267a5bc9b820f8ac3a5a3c99ad4e5544b0be11adf4bbecb85e19615ee82dc'}]}, 'timestamp': '2025-10-02 12:49:16.421293', '_unique_id': 'ec4fc8a9d14241ce9fe7ac7574905ca1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.421 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.422 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.422 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.422 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea389cd7-eee6-43a3-828e-18e6a187cfa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.422352', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '348ba48a-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.080737652, 'message_signature': '359b9cdacdd3cc6a52d0d2c7e87059f6ae3d5a5c07455beab916f97a04f8190c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.422352', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '348bac00-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.080737652, 'message_signature': '804b65848ab3076a2dbc039a0263a96dbbab6791ab998736cbeea21a3d458186'}]}, 'timestamp': '2025-10-02 12:49:16.422746', '_unique_id': 'ab2e421d8e8343beb94a1486acba0dbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.423 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e07499aa-7dee-4496-a469-16783be201e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.423823', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '348bdf04-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': '80d8e9940271f72a2e2e2b7577ffe55d618d3370a4019a5c2a56a8cf5e8e4c50'}]}, 'timestamp': '2025-10-02 12:49:16.424068', '_unique_id': 'be5152b7814d4ce2847803dc2054fa30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.424 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4360d6e1-fa23-48b5-9e96-81fac4065539', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.425119', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '348c10a0-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': '808afd0f6a2afccc9b8504ffc872c76ccb87598fd9e02df8ac8ecbd186e60ffd'}]}, 'timestamp': '2025-10-02 12:49:16.425336', '_unique_id': '62fa7738f37546d1a99f21f938192e9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.425 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.426 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.426 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.426 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623>]
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.426 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.426 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.write.latency volume: 57848571676 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.426 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60ab97d3-5d21-4415-8b03-68ed0ffa4196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57848571676, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.426636', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '348c4c32-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': '7691f89a2a8e0f1a2ebad94ca61cb0da5688722cf930e7e6bdb50c65efa79a4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': 
'575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.426636', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '348c5592-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': 'b59291dcb6a2d4b5d98f1d21e5570c3a1236d8c2de970ee8d047671bc3d410f6'}]}, 'timestamp': '2025-10-02 12:49:16.427092', '_unique_id': 'b363dbd7727646c8bfa340e9eea351a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.427 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.428 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.428 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.428 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd50bee5c-2cce-45b8-9b95-bd6107b07979', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.428177', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '348c8828-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': '4d5dc9aaecf88d4c486ff18c04b32c1a4c264813563c2b16d98d003d05f7d8c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': 
'575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.428177', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '348c9002-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': '16af7a7d51fe157611d2fe0ea8f221ab967b0dc25edde1c31982e2f0045f9bbd'}]}, 'timestamp': '2025-10-02 12:49:16.428597', '_unique_id': '0e2fe7d0415b4e4daaa018fa229e3818'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.429 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b21820c-3901-42b4-8350-9a5a4bb7c5d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.429712', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '348cc432-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': 'f60eead349fdef185c0b5749f99ba9e7d4820ca36c8b27f69b078eb4a141ac0e'}]}, 'timestamp': '2025-10-02 12:49:16.429957', '_unique_id': '25d1526416744745aaba930e96922b02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.430 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.incoming.bytes volume: 12748 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7404a0a1-b657-4132-882e-40ebffcef4a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12748, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.431218', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '348cfef2-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': 'b54c1552d7848703556602a8ea8694b907aa7155d7b97f50ae15f8149bd5b0b6'}]}, 'timestamp': '2025-10-02 12:49:16.431438', '_unique_id': '371962cffe7541aa8eef21e4a8cff2af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.432 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.446 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/cpu volume: 12000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ce4a0a8-62b9-4bf8-9526-ea80028d194c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12000000000, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'timestamp': '2025-10-02T12:49:16.432479', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '348f707e-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.124878909, 'message_signature': '98ba16d52803e21c100d4b63b3df7c4bcb3e094d579bde4d459ac2a2558178ae'}]}, 'timestamp': '2025-10-02 12:49:16.447574', '_unique_id': '13d2afe14d2e4dbcb8799f7039083302'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.448 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.449 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.read.requests volume: 1136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.449 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7074c31-02ad-4d53-9630-ad74bd009bb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1136, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-vda', 'timestamp': '2025-10-02T12:49:16.449647', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '348fcf6a-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': 'a7764b51594e8afec2c671fa247c11e528bdf28f0c56c26c4270a6687ab0d6ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0-sda', 'timestamp': '2025-10-02T12:49:16.449647', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '348fd870-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.052099632, 'message_signature': '201712a545ade229ce3438b765eadc96cc85edb322df583d753f1ef4c33bdfb9'}]}, 'timestamp': '2025-10-02 12:49:16.450101', '_unique_id': '8eb3c31b81734ba98552a3a28cee01f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.450 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/memory.usage volume: 42.8828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c427ac03-c3b3-439c-ac97-d5e67f1c8039', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.8828125, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'timestamp': '2025-10-02T12:49:16.451199', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'instance-000000b8', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '34900b9c-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.124878909, 'message_signature': '52b108a70688c74ca031fdce4295f4f020777906129957b41161bc6aa6237c9b'}]}, 'timestamp': '2025-10-02 12:49:16.451416', '_unique_id': '26fbcd71a9364e80b449833ee80f05a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.451 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.452 12 DEBUG ceilometer.compute.pollsters [-] c7dd2042-e2a5-4491-aa1c-0e72597641e0/network.outgoing.packets volume: 72 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '633415ed-81aa-4808-8cc7-b2fc93bbdde9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 72, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b8-c7dd2042-e2a5-4491-aa1c-0e72597641e0-tap84cebaa0-81', 'timestamp': '2025-10-02T12:49:16.452452', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623', 'name': 'tap84cebaa0-81', 'instance_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84cebaa0-81'}, 'message_id': '34903ce8-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7348.043028417, 'message_signature': '1b8b5dea43e8bafd86179254c9fd1acf10b174984311a79a982d9e83d694ad25'}]}, 'timestamp': '2025-10-02 12:49:16.452685', '_unique_id': '05539e581b43474cb7037a52ea892c6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:49:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:49:16.453 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:49:21 np0005466013 nova_compute[192144]: 2025-10-02 12:49:21.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:21 np0005466013 nova_compute[192144]: 2025-10-02 12:49:21.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:22 np0005466013 podman[253258]: 2025-10-02 12:49:22.680594196 +0000 UTC m=+0.061001468 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:49:22 np0005466013 podman[253260]: 2025-10-02 12:49:22.688294887 +0000 UTC m=+0.064498168 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm)
Oct  2 08:49:22 np0005466013 podman[253259]: 2025-10-02 12:49:22.704775715 +0000 UTC m=+0.084314210 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 08:49:25 np0005466013 podman[253318]: 2025-10-02 12:49:25.664169397 +0000 UTC m=+0.042300888 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:49:25 np0005466013 podman[253319]: 2025-10-02 12:49:25.688349466 +0000 UTC m=+0.061558592 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:49:26 np0005466013 nova_compute[192144]: 2025-10-02 12:49:26.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:26 np0005466013 nova_compute[192144]: 2025-10-02 12:49:26.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:31 np0005466013 nova_compute[192144]: 2025-10-02 12:49:31.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:31 np0005466013 nova_compute[192144]: 2025-10-02 12:49:31.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.310 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.310 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.311 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.311 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.311 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.323 2 INFO nova.compute.manager [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Terminating instance#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.337 2 DEBUG nova.compute.manager [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:49:34 np0005466013 kernel: tap84cebaa0-81 (unregistering): left promiscuous mode
Oct  2 08:49:34 np0005466013 NetworkManager[51205]: <info>  [1759409374.3585] device (tap84cebaa0-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:49:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:49:34Z|00789|binding|INFO|Releasing lport 84cebaa0-8158-4467-a214-70216aa0fa77 from this chassis (sb_readonly=0)
Oct  2 08:49:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:49:34Z|00790|binding|INFO|Setting lport 84cebaa0-8158-4467-a214-70216aa0fa77 down in Southbound
Oct  2 08:49:34 np0005466013 ovn_controller[94366]: 2025-10-02T12:49:34Z|00791|binding|INFO|Removing iface tap84cebaa0-81 ovn-installed in OVS
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:34 np0005466013 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Oct  2 08:49:34 np0005466013 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b8.scope: Consumed 17.274s CPU time.
Oct  2 08:49:34 np0005466013 systemd-machined[152202]: Machine qemu-82-instance-000000b8 terminated.
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.442 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:8b:55 10.100.0.6'], port_security=['fa:16:3e:95:8b:55 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c7dd2042-e2a5-4491-aa1c-0e72597641e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3bf5c068-41c3-45ca-8822-72717311e7da d068f527-d669-40d0-ac19-cae69897b62d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78978e88-15ad-4f25-bc19-feb08335ac33, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=84cebaa0-8158-4467-a214-70216aa0fa77) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.443 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 84cebaa0-8158-4467-a214-70216aa0fa77 in datapath 3c776fa4-63c0-44fa-bf3f-04ad74974c2c unbound from our chassis#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.445 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c776fa4-63c0-44fa-bf3f-04ad74974c2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.446 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[3087e0dc-730e-4f41-ae1e-3a812ab6773f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.446 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c namespace which is not needed anymore#033[00m
Oct  2 08:49:34 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [NOTICE]   (252976) : haproxy version is 2.8.14-c23fe91
Oct  2 08:49:34 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [NOTICE]   (252976) : path to executable is /usr/sbin/haproxy
Oct  2 08:49:34 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [WARNING]  (252976) : Exiting Master process...
Oct  2 08:49:34 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [WARNING]  (252976) : Exiting Master process...
Oct  2 08:49:34 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [ALERT]    (252976) : Current worker (252978) exited with code 143 (Terminated)
Oct  2 08:49:34 np0005466013 neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c[252972]: [WARNING]  (252976) : All workers exited. Exiting... (0)
Oct  2 08:49:34 np0005466013 systemd[1]: libpod-e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945.scope: Deactivated successfully.
Oct  2 08:49:34 np0005466013 conmon[252972]: conmon e50db7013cf41217d5ea <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945.scope/container/memory.events
Oct  2 08:49:34 np0005466013 podman[253386]: 2025-10-02 12:49:34.589660897 +0000 UTC m=+0.061978916 container died e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.605 2 INFO nova.virt.libvirt.driver [-] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Instance destroyed successfully.#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.607 2 DEBUG nova.objects.instance [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'resources' on Instance uuid c7dd2042-e2a5-4491-aa1c-0e72597641e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.630 2 DEBUG nova.virt.libvirt.vif [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:48:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-1773803623',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=184,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI2KKxYqJuo+bm0uXO0va+WiltctIuUrNVSuyXKH60Q282vpKz7lkIUwo7YbhQgvFPQ6W6pvlS1MgI71IgsIlYiUsaPlzFVJnshPK84X/j2YUTiXwv4g5W08cDEUTRF7vw==',key_name='tempest-TestSecurityGroupsBasicOps-1976615750',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:48:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-dv489toa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:48:30Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=c7dd2042-e2a5-4491-aa1c-0e72597641e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.630 2 DEBUG nova.network.os_vif_util [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "84cebaa0-8158-4467-a214-70216aa0fa77", "address": "fa:16:3e:95:8b:55", "network": {"id": "3c776fa4-63c0-44fa-bf3f-04ad74974c2c", "bridge": "br-int", "label": "tempest-network-smoke--1497944835", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84cebaa0-81", "ovs_interfaceid": "84cebaa0-8158-4467-a214-70216aa0fa77", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.631 2 DEBUG nova.network.os_vif_util [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:8b:55,bridge_name='br-int',has_traffic_filtering=True,id=84cebaa0-8158-4467-a214-70216aa0fa77,network=Network(3c776fa4-63c0-44fa-bf3f-04ad74974c2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84cebaa0-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.631 2 DEBUG os_vif [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:8b:55,bridge_name='br-int',has_traffic_filtering=True,id=84cebaa0-8158-4467-a214-70216aa0fa77,network=Network(3c776fa4-63c0-44fa-bf3f-04ad74974c2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84cebaa0-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.633 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84cebaa0-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.640 2 INFO os_vif [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:8b:55,bridge_name='br-int',has_traffic_filtering=True,id=84cebaa0-8158-4467-a214-70216aa0fa77,network=Network(3c776fa4-63c0-44fa-bf3f-04ad74974c2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84cebaa0-81')#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.641 2 INFO nova.virt.libvirt.driver [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Deleting instance files /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0_del#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.642 2 INFO nova.virt.libvirt.driver [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Deletion of /var/lib/nova/instances/c7dd2042-e2a5-4491-aa1c-0e72597641e0_del complete#033[00m
Oct  2 08:49:34 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945-userdata-shm.mount: Deactivated successfully.
Oct  2 08:49:34 np0005466013 systemd[1]: var-lib-containers-storage-overlay-b5ece0441b1a7a5b58653ecb9ea88d47d6d0284f7c461d4b695439d43caf69a4-merged.mount: Deactivated successfully.
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.721 2 INFO nova.compute.manager [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.722 2 DEBUG oslo.service.loopingcall [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.722 2 DEBUG nova.compute.manager [-] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.722 2 DEBUG nova.network.neutron [-] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:49:34 np0005466013 podman[253386]: 2025-10-02 12:49:34.815185052 +0000 UTC m=+0.287503051 container cleanup e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:49:34 np0005466013 systemd[1]: libpod-conmon-e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945.scope: Deactivated successfully.
Oct  2 08:49:34 np0005466013 podman[253430]: 2025-10-02 12:49:34.907265092 +0000 UTC m=+0.065394863 container remove e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.914 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[fa60fbbb-070e-4599-a908-4534adf29f42]: (4, ('Thu Oct  2 12:49:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c (e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945)\ne50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945\nThu Oct  2 12:49:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c (e50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945)\ne50db7013cf41217d5ea112b0cbb46da6a67fd1cd1d2b584658e6564c9781945\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.916 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[70cf136d-5235-42ba-a72e-861881d220dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.918 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c776fa4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:34 np0005466013 kernel: tap3c776fa4-60: left promiscuous mode
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:34 np0005466013 nova_compute[192144]: 2025-10-02 12:49:34.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.935 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c7ba4c-8d61-4c9c-852c-1884b0c095d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.966 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[59908747-8209-42d5-9ebc-c62b5161bd52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.967 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[82fd38d3-9a8d-4016-9938-1862ede11a68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.983 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b2493f8c-5a0f-4975-8379-94ece0fb55b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 729879, 'reachable_time': 32715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253445, 'error': None, 'target': 'ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:34 np0005466013 systemd[1]: run-netns-ovnmeta\x2d3c776fa4\x2d63c0\x2d44fa\x2dbf3f\x2d04ad74974c2c.mount: Deactivated successfully.
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.988 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c776fa4-63c0-44fa-bf3f-04ad74974c2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:49:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:49:34.988 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[24f9cf27-c80e-4cde-a213-729726d61e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.014 2 DEBUG nova.network.neutron [-] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.030 2 INFO nova.compute.manager [-] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Took 1.31 seconds to deallocate network for instance.#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.112 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.112 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.189 2 DEBUG nova.compute.provider_tree [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.205 2 DEBUG nova.scheduler.client.report [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.223 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.250 2 DEBUG nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-changed-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.250 2 DEBUG nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Refreshing instance network info cache due to event network-changed-84cebaa0-8158-4467-a214-70216aa0fa77. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.251 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.251 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.251 2 DEBUG nova.network.neutron [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Refreshing network info cache for port 84cebaa0-8158-4467-a214-70216aa0fa77 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.253 2 INFO nova.scheduler.client.report [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Deleted allocations for instance c7dd2042-e2a5-4491-aa1c-0e72597641e0#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.317 2 DEBUG oslo_concurrency.lockutils [None req-59a4dfc5-c2b1-4745-ad73-4ca665a9e7ae 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.666 2 DEBUG nova.network.neutron [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.955 2 DEBUG nova.network.neutron [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.972 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c7dd2042-e2a5-4491-aa1c-0e72597641e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.973 2 DEBUG nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-vif-unplugged-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.973 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.973 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.974 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.974 2 DEBUG nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] No waiting events found dispatching network-vif-unplugged-84cebaa0-8158-4467-a214-70216aa0fa77 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.974 2 WARNING nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received unexpected event network-vif-unplugged-84cebaa0-8158-4467-a214-70216aa0fa77 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.974 2 DEBUG nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.975 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.975 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.975 2 DEBUG oslo_concurrency.lockutils [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c7dd2042-e2a5-4491-aa1c-0e72597641e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.975 2 DEBUG nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] No waiting events found dispatching network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.976 2 WARNING nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received unexpected event network-vif-plugged-84cebaa0-8158-4467-a214-70216aa0fa77 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:49:36 np0005466013 nova_compute[192144]: 2025-10-02 12:49:36.976 2 DEBUG nova.compute.manager [req-b87fad57-9081-4f1e-a8eb-860d7b79fd72 req-c7725014-ab74-43aa-b36a-54a607a8cf1b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Received event network-vif-deleted-84cebaa0-8158-4467-a214-70216aa0fa77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:49:39 np0005466013 nova_compute[192144]: 2025-10-02 12:49:39.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:41 np0005466013 nova_compute[192144]: 2025-10-02 12:49:41.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:42 np0005466013 podman[253448]: 2025-10-02 12:49:42.678850331 +0000 UTC m=+0.047816942 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:49:42 np0005466013 podman[253447]: 2025-10-02 12:49:42.685937963 +0000 UTC m=+0.059038284 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:49:42 np0005466013 podman[253449]: 2025-10-02 12:49:42.740947069 +0000 UTC m=+0.101192866 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Oct  2 08:49:44 np0005466013 nova_compute[192144]: 2025-10-02 12:49:44.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:44 np0005466013 nova_compute[192144]: 2025-10-02 12:49:44.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:45 np0005466013 nova_compute[192144]: 2025-10-02 12:49:45.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:46 np0005466013 nova_compute[192144]: 2025-10-02 12:49:46.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:49 np0005466013 nova_compute[192144]: 2025-10-02 12:49:49.605 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409374.6031756, c7dd2042-e2a5-4491-aa1c-0e72597641e0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:49:49 np0005466013 nova_compute[192144]: 2025-10-02 12:49:49.606 2 INFO nova.compute.manager [-] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:49:49 np0005466013 nova_compute[192144]: 2025-10-02 12:49:49.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:50 np0005466013 nova_compute[192144]: 2025-10-02 12:49:50.698 2 DEBUG nova.compute.manager [None req-473c48e0-2c6a-4eca-a081-61f6fa4be0ed - - - - - -] [instance: c7dd2042-e2a5-4491-aa1c-0e72597641e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:49:51 np0005466013 nova_compute[192144]: 2025-10-02 12:49:51.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:51 np0005466013 nova_compute[192144]: 2025-10-02 12:49:51.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:51 np0005466013 nova_compute[192144]: 2025-10-02 12:49:51.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:51 np0005466013 nova_compute[192144]: 2025-10-02 12:49:51.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:49:52 np0005466013 nova_compute[192144]: 2025-10-02 12:49:52.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.121 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.122 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.122 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.122 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:49:53 np0005466013 podman[253518]: 2025-10-02 12:49:53.225221805 +0000 UTC m=+0.063515414 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:49:53 np0005466013 podman[253521]: 2025-10-02 12:49:53.241799655 +0000 UTC m=+0.074556470 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:49:53 np0005466013 podman[253520]: 2025-10-02 12:49:53.255641019 +0000 UTC m=+0.079377462 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.309 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.310 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5727MB free_disk=73.13184356689453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.310 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.310 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.722 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.723 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.744 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.790 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.876 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:49:53 np0005466013 nova_compute[192144]: 2025-10-02 12:49:53.876 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:49:54 np0005466013 nova_compute[192144]: 2025-10-02 12:49:54.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:54 np0005466013 nova_compute[192144]: 2025-10-02 12:49:54.877 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:49:56 np0005466013 nova_compute[192144]: 2025-10-02 12:49:56.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:56 np0005466013 podman[253576]: 2025-10-02 12:49:56.675744582 +0000 UTC m=+0.052803398 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:49:56 np0005466013 podman[253577]: 2025-10-02 12:49:56.688695079 +0000 UTC m=+0.059161557 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:49:58 np0005466013 nova_compute[192144]: 2025-10-02 12:49:58.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:49:59 np0005466013 nova_compute[192144]: 2025-10-02 12:49:59.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:49:59 np0005466013 nova_compute[192144]: 2025-10-02 12:49:59.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:50:01 np0005466013 nova_compute[192144]: 2025-10-02 12:50:01.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:50:02.334 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:50:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:50:02.335 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:50:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:50:02.335 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:50:04 np0005466013 nova_compute[192144]: 2025-10-02 12:50:04.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:04 np0005466013 nova_compute[192144]: 2025-10-02 12:50:04.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:50:04 np0005466013 nova_compute[192144]: 2025-10-02 12:50:04.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:50:05 np0005466013 nova_compute[192144]: 2025-10-02 12:50:05.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:50:05 np0005466013 nova_compute[192144]: 2025-10-02 12:50:05.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:50:05 np0005466013 nova_compute[192144]: 2025-10-02 12:50:05.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:50:06 np0005466013 nova_compute[192144]: 2025-10-02 12:50:06.334 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:50:06 np0005466013 nova_compute[192144]: 2025-10-02 12:50:06.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:08 np0005466013 nova_compute[192144]: 2025-10-02 12:50:08.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:50:08 np0005466013 nova_compute[192144]: 2025-10-02 12:50:08.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct  2 08:50:09 np0005466013 nova_compute[192144]: 2025-10-02 12:50:09.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:11 np0005466013 nova_compute[192144]: 2025-10-02 12:50:11.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:12 np0005466013 nova_compute[192144]: 2025-10-02 12:50:12.029 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:50:12 np0005466013 nova_compute[192144]: 2025-10-02 12:50:12.030 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct  2 08:50:12 np0005466013 nova_compute[192144]: 2025-10-02 12:50:12.329 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct  2 08:50:13 np0005466013 podman[253621]: 2025-10-02 12:50:13.676309373 +0000 UTC m=+0.051339511 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:50:13 np0005466013 podman[253622]: 2025-10-02 12:50:13.684203261 +0000 UTC m=+0.052271091 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:50:13 np0005466013 podman[253623]: 2025-10-02 12:50:13.735754368 +0000 UTC m=+0.105165820 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 08:50:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:50:14.438 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:50:14 np0005466013 nova_compute[192144]: 2025-10-02 12:50:14.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:14 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:50:14.439 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:50:14 np0005466013 nova_compute[192144]: 2025-10-02 12:50:14.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:16 np0005466013 nova_compute[192144]: 2025-10-02 12:50:16.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:19 np0005466013 nova_compute[192144]: 2025-10-02 12:50:19.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:21 np0005466013 nova_compute[192144]: 2025-10-02 12:50:21.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:21 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:50:21.440 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:50:23 np0005466013 podman[253690]: 2025-10-02 12:50:23.679742083 +0000 UTC m=+0.053018424 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, io.buildah.version=1.33.7)
Oct  2 08:50:23 np0005466013 podman[253689]: 2025-10-02 12:50:23.680481527 +0000 UTC m=+0.055706469 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:50:23 np0005466013 podman[253691]: 2025-10-02 12:50:23.710819278 +0000 UTC m=+0.079189416 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:50:23 np0005466013 nova_compute[192144]: 2025-10-02 12:50:23.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:50:24 np0005466013 nova_compute[192144]: 2025-10-02 12:50:24.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:26 np0005466013 nova_compute[192144]: 2025-10-02 12:50:26.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:27 np0005466013 podman[253750]: 2025-10-02 12:50:27.678830081 +0000 UTC m=+0.056969528 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:50:27 np0005466013 podman[253751]: 2025-10-02 12:50:27.689721403 +0000 UTC m=+0.062507232 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:50:27 np0005466013 ovn_controller[94366]: 2025-10-02T12:50:27Z|00792|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct  2 08:50:29 np0005466013 nova_compute[192144]: 2025-10-02 12:50:29.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:31 np0005466013 nova_compute[192144]: 2025-10-02 12:50:31.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:34 np0005466013 nova_compute[192144]: 2025-10-02 12:50:34.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:36 np0005466013 nova_compute[192144]: 2025-10-02 12:50:36.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:39 np0005466013 nova_compute[192144]: 2025-10-02 12:50:39.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:41 np0005466013 nova_compute[192144]: 2025-10-02 12:50:41.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:50:44 np0005466013 podman[253794]: 2025-10-02 12:50:44.661006673 +0000 UTC m=+0.042855756 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:50:44 np0005466013 podman[253795]: 2025-10-02 12:50:44.6672949 +0000 UTC m=+0.045911832 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:50:44 np0005466013 nova_compute[192144]: 2025-10-02 12:50:44.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:44 np0005466013 podman[253796]: 2025-10-02 12:50:44.706696936 +0000 UTC m=+0.082151388 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:50:46 np0005466013 nova_compute[192144]: 2025-10-02 12:50:46.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:49 np0005466013 nova_compute[192144]: 2025-10-02 12:50:49.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:51 np0005466013 nova_compute[192144]: 2025-10-02 12:50:51.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:53 np0005466013 nova_compute[192144]: 2025-10-02 12:50:53.499 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:53 np0005466013 nova_compute[192144]: 2025-10-02 12:50:53.499 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:53 np0005466013 nova_compute[192144]: 2025-10-02 12:50:53.499 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:50:53 np0005466013 nova_compute[192144]: 2025-10-02 12:50:53.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.035 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.162 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.163 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.1318588256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.163 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.164 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.370 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.370 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.440 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.524 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.525 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 generation from 86 to 87 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.525 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.539 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.569 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.621 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:50:54 np0005466013 podman[253862]: 2025-10-02 12:50:54.687909988 +0000 UTC m=+0.061363667 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  2 08:50:54 np0005466013 nova_compute[192144]: 2025-10-02 12:50:54.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:54 np0005466013 podman[253864]: 2025-10-02 12:50:54.700853604 +0000 UTC m=+0.066790967 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:50:54 np0005466013 podman[253863]: 2025-10-02 12:50:54.718437576 +0000 UTC m=+0.091514043 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Oct  2 08:50:56 np0005466013 nova_compute[192144]: 2025-10-02 12:50:56.463 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:50:56 np0005466013 nova_compute[192144]: 2025-10-02 12:50:56.464 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:50:56 np0005466013 nova_compute[192144]: 2025-10-02 12:50:56.464 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:56 np0005466013 nova_compute[192144]: 2025-10-02 12:50:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.047 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.048 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.071 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.213 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.213 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.220 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.221 2 INFO nova.compute.claims [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.359 2 DEBUG nova.compute.provider_tree [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.377 2 DEBUG nova.scheduler.client.report [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.408 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.409 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.479 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.479 2 DEBUG nova.network.neutron [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.503 2 INFO nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.528 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.662 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.665 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.666 2 INFO nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Creating image(s)#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.667 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "/var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.667 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.669 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.689 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.717 2 DEBUG nova.policy [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.750 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.750 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.751 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.762 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.814 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:50:57 np0005466013 nova_compute[192144]: 2025-10-02 12:50:57.815 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.028 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk 1073741824" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.029 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.029 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.086 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.087 2 DEBUG nova.virt.disk.api [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Checking if we can resize image /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.087 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.145 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.146 2 DEBUG nova.virt.disk.api [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Cannot resize image /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.146 2 DEBUG nova.objects.instance [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'migration_context' on Instance uuid 702895f8-1281-4b2f-8f4b-c838ce84b37a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.180 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.181 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Ensure instance console log exists: /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.181 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.181 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.182 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:58 np0005466013 nova_compute[192144]: 2025-10-02 12:50:58.464 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:58 np0005466013 podman[253934]: 2025-10-02 12:50:58.673714802 +0000 UTC m=+0.051647832 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:50:58 np0005466013 podman[253935]: 2025-10-02 12:50:58.681618049 +0000 UTC m=+0.056113571 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:50:59 np0005466013 nova_compute[192144]: 2025-10-02 12:50:59.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:59 np0005466013 nova_compute[192144]: 2025-10-02 12:50:59.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:00 np0005466013 nova_compute[192144]: 2025-10-02 12:51:00.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:01 np0005466013 nova_compute[192144]: 2025-10-02 12:51:01.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:02.336 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:02.336 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:02.336 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:04 np0005466013 nova_compute[192144]: 2025-10-02 12:51:04.210 2 DEBUG nova.network.neutron [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Successfully created port: 50f63236-6cb1-4e39-bed8-36dac99fd408 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:51:04 np0005466013 nova_compute[192144]: 2025-10-02 12:51:04.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:04 np0005466013 nova_compute[192144]: 2025-10-02 12:51:04.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:05 np0005466013 nova_compute[192144]: 2025-10-02 12:51:05.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:06 np0005466013 nova_compute[192144]: 2025-10-02 12:51:06.078 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:06 np0005466013 nova_compute[192144]: 2025-10-02 12:51:06.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:06 np0005466013 nova_compute[192144]: 2025-10-02 12:51:06.976 2 DEBUG nova.network.neutron [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Successfully updated port: 50f63236-6cb1-4e39-bed8-36dac99fd408 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:51:06 np0005466013 nova_compute[192144]: 2025-10-02 12:51:06.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:06 np0005466013 nova_compute[192144]: 2025-10-02 12:51:06.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:51:06 np0005466013 nova_compute[192144]: 2025-10-02 12:51:06.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.113 2 DEBUG nova.compute.manager [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-changed-50f63236-6cb1-4e39-bed8-36dac99fd408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.113 2 DEBUG nova.compute.manager [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Refreshing instance network info cache due to event network-changed-50f63236-6cb1-4e39-bed8-36dac99fd408. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.114 2 DEBUG oslo_concurrency.lockutils [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.114 2 DEBUG oslo_concurrency.lockutils [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.114 2 DEBUG nova.network.neutron [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Refreshing network info cache for port 50f63236-6cb1-4e39-bed8-36dac99fd408 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.131 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.131 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.134 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:07 np0005466013 nova_compute[192144]: 2025-10-02 12:51:07.736 2 DEBUG nova.network.neutron [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:51:08 np0005466013 nova_compute[192144]: 2025-10-02 12:51:08.033 2 DEBUG nova.network.neutron [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:08 np0005466013 nova_compute[192144]: 2025-10-02 12:51:08.159 2 DEBUG oslo_concurrency.lockutils [req-f7452fff-63ce-4f74-a268-122372aafd74 req-5a22cfb6-25d8-4b1e-afd6-4e226502f90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:08 np0005466013 nova_compute[192144]: 2025-10-02 12:51:08.161 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquired lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:08 np0005466013 nova_compute[192144]: 2025-10-02 12:51:08.161 2 DEBUG nova.network.neutron [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:51:08 np0005466013 nova_compute[192144]: 2025-10-02 12:51:08.762 2 DEBUG nova.network.neutron [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:51:09 np0005466013 nova_compute[192144]: 2025-10-02 12:51:09.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.259 2 DEBUG nova.network.neutron [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Updating instance_info_cache with network_info: [{"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.380 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Releasing lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.380 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Instance network_info: |[{"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.383 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Start _get_guest_xml network_info=[{"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.389 2 WARNING nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.397 2 DEBUG nova.virt.libvirt.host [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.398 2 DEBUG nova.virt.libvirt.host [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.404 2 DEBUG nova.virt.libvirt.host [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.405 2 DEBUG nova.virt.libvirt.host [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.406 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.407 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.407 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.407 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.408 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.408 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.408 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.408 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.409 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.409 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.409 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.410 2 DEBUG nova.virt.hardware [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.415 2 DEBUG nova.virt.libvirt.vif [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:50:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ge',id=187,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPtdLgRziYi/gtQwh2c90NnE9jWcSnkXXhVGvo+TNtzW3MSE83NoyumTXAaB/UU4ExIeaKr77+vb5N+WSXOcyyn7dDdXMPaG0pk4M0kXpEwbShNs9Jn1NVnaa85coBBWBQ==',key_name='tempest-TestSecurityGroupsBasicOps-1574692784',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-0ovuvx2b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:50:57Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=702895f8-1281-4b2f-8f4b-c838ce84b37a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.415 2 DEBUG nova.network.os_vif_util [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.416 2 DEBUG nova.network.os_vif_util [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:54:85,bridge_name='br-int',has_traffic_filtering=True,id=50f63236-6cb1-4e39-bed8-36dac99fd408,network=Network(c56f578e-f013-4483-b9f2-ee1459896133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f63236-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.417 2 DEBUG nova.objects.instance [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'pci_devices' on Instance uuid 702895f8-1281-4b2f-8f4b-c838ce84b37a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.521 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <uuid>702895f8-1281-4b2f-8f4b-c838ce84b37a</uuid>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <name>instance-000000bb</name>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <memory>131072</memory>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <vcpu>1</vcpu>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <metadata>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732</nova:name>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <nova:creationTime>2025-10-02 12:51:10</nova:creationTime>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <nova:flavor name="m1.nano">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:memory>128</nova:memory>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:disk>1</nova:disk>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:swap>0</nova:swap>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      </nova:flavor>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <nova:owner>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:user uuid="2d2b4a2da57543ef88e44ae28ad61647">tempest-TestSecurityGroupsBasicOps-1020134341-project-member</nova:user>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:project uuid="575f3d227ab24f2daa62e65e14a4cd9c">tempest-TestSecurityGroupsBasicOps-1020134341</nova:project>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      </nova:owner>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <nova:ports>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        <nova:port uuid="50f63236-6cb1-4e39-bed8-36dac99fd408">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:        </nova:port>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      </nova:ports>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </nova:instance>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  </metadata>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <sysinfo type="smbios">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <system>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <entry name="serial">702895f8-1281-4b2f-8f4b-c838ce84b37a</entry>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <entry name="uuid">702895f8-1281-4b2f-8f4b-c838ce84b37a</entry>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </system>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  </sysinfo>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <os>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <boot dev="hd"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <smbios mode="sysinfo"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  </os>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <features>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <acpi/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <apic/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <vmcoreinfo/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  </features>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <clock offset="utc">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <timer name="hpet" present="no"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  </clock>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <cpu mode="custom" match="exact">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <model>Nehalem</model>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  </cpu>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  <devices>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <disk type="file" device="disk">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <target dev="vda" bus="virtio"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <disk type="file" device="cdrom">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <source file="/var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.config"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <target dev="sda" bus="sata"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </disk>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <interface type="ethernet">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <mac address="fa:16:3e:17:54:85"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <mtu size="1442"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <target dev="tap50f63236-6c"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </interface>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <serial type="pty">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <log file="/var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/console.log" append="off"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </serial>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <video>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <model type="virtio"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </video>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <input type="tablet" bus="usb"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <rng model="virtio">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </rng>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <controller type="usb" index="0"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    <memballoon model="virtio">
Oct  2 08:51:10 np0005466013 nova_compute[192144]:      <stats period="10"/>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:    </memballoon>
Oct  2 08:51:10 np0005466013 nova_compute[192144]:  </devices>
Oct  2 08:51:10 np0005466013 nova_compute[192144]: </domain>
Oct  2 08:51:10 np0005466013 nova_compute[192144]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.522 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Preparing to wait for external event network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.523 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.523 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.523 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.524 2 DEBUG nova.virt.libvirt.vif [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:50:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ge',id=187,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPtdLgRziYi/gtQwh2c90NnE9jWcSnkXXhVGvo+TNtzW3MSE83NoyumTXAaB/UU4ExIeaKr77+vb5N+WSXOcyyn7dDdXMPaG0pk4M0kXpEwbShNs9Jn1NVnaa85coBBWBQ==',key_name='tempest-TestSecurityGroupsBasicOps-1574692784',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-0ovuvx2b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:50:57Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=702895f8-1281-4b2f-8f4b-c838ce84b37a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.524 2 DEBUG nova.network.os_vif_util [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.525 2 DEBUG nova.network.os_vif_util [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:54:85,bridge_name='br-int',has_traffic_filtering=True,id=50f63236-6cb1-4e39-bed8-36dac99fd408,network=Network(c56f578e-f013-4483-b9f2-ee1459896133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f63236-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.525 2 DEBUG os_vif [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:54:85,bridge_name='br-int',has_traffic_filtering=True,id=50f63236-6cb1-4e39-bed8-36dac99fd408,network=Network(c56f578e-f013-4483-b9f2-ee1459896133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f63236-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.526 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.526 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.529 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50f63236-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.529 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50f63236-6c, col_values=(('external_ids', {'iface-id': '50f63236-6cb1-4e39-bed8-36dac99fd408', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:54:85', 'vm-uuid': '702895f8-1281-4b2f-8f4b-c838ce84b37a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:10 np0005466013 NetworkManager[51205]: <info>  [1759409470.5323] manager: (tap50f63236-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.537 2 INFO os_vif [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:54:85,bridge_name='br-int',has_traffic_filtering=True,id=50f63236-6cb1-4e39-bed8-36dac99fd408,network=Network(c56f578e-f013-4483-b9f2-ee1459896133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f63236-6c')#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.744 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.745 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.745 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No VIF found with MAC fa:16:3e:17:54:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:51:10 np0005466013 nova_compute[192144]: 2025-10-02 12:51:10.745 2 INFO nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Using config drive#033[00m
Oct  2 08:51:11 np0005466013 nova_compute[192144]: 2025-10-02 12:51:11.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:11 np0005466013 nova_compute[192144]: 2025-10-02 12:51:11.892 2 INFO nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Creating config drive at /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.config#033[00m
Oct  2 08:51:11 np0005466013 nova_compute[192144]: 2025-10-02 12:51:11.896 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpblw6r4pe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.021 2 DEBUG oslo_concurrency.processutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpblw6r4pe" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:51:12 np0005466013 kernel: tap50f63236-6c: entered promiscuous mode
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.0873] manager: (tap50f63236-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/354)
Oct  2 08:51:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:12Z|00793|binding|INFO|Claiming lport 50f63236-6cb1-4e39-bed8-36dac99fd408 for this chassis.
Oct  2 08:51:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:12Z|00794|binding|INFO|50f63236-6cb1-4e39-bed8-36dac99fd408: Claiming fa:16:3e:17:54:85 10.100.0.9
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.1001] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.1010] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Oct  2 08:51:12 np0005466013 systemd-udevd[253993]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:51:12 np0005466013 systemd-machined[152202]: New machine qemu-83-instance-000000bb.
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.1342] device (tap50f63236-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.1356] device (tap50f63236-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:51:12 np0005466013 systemd[1]: Started Virtual Machine qemu-83-instance-000000bb.
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.190 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:54:85 10.100.0.9'], port_security=['fa:16:3e:17:54:85 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c56f578e-f013-4483-b9f2-ee1459896133', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '48a842a9-048b-49fa-aad1-710802b3266f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50529df0-c539-4067-a62b-3ef6d48b20aa, chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=50f63236-6cb1-4e39-bed8-36dac99fd408) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.192 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 50f63236-6cb1-4e39-bed8-36dac99fd408 in datapath c56f578e-f013-4483-b9f2-ee1459896133 bound to our chassis#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.193 103323 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c56f578e-f013-4483-b9f2-ee1459896133#033[00m
Oct  2 08:51:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:12Z|00795|binding|INFO|Setting lport 50f63236-6cb1-4e39-bed8-36dac99fd408 up in Southbound
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.217 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[90539847-337d-4394-9f19-baec503e9465]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.218 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc56f578e-f1 in ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.222 219962 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc56f578e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.223 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ee52c4a5-645d-4546-ba5a-5885e8075fa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.224 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cf616b29-5fe6-4c35-92c4-31668b501d30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.235 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[3af4f8ba-a051-4d27-96e3-0771f039e045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:12Z|00796|binding|INFO|Setting lport 50f63236-6cb1-4e39-bed8-36dac99fd408 ovn-installed in OVS
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.252 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d470b8-3508-4570-920b-5adba35915fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.285 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd3a553-76ff-43d6-b178-0d0c7f2a87db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.289 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[dae9c766-4d9e-4708-bd2f-f07425b4b84c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.2907] manager: (tapc56f578e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/357)
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.321 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[3d845c40-ef73-4186-9044-8f6c03f79e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.326 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[14ff409c-80ae-42cb-8e27-eef0165bd62f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.3537] device (tapc56f578e-f0): carrier: link connected
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.360 219977 DEBUG oslo.privsep.daemon [-] privsep: reply[6a113812-a0f0-467b-9758-c98017fb50bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.377 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[02bcc154-9d79-46c6-a237-d4f4a93bdf48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc56f578e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:73:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 234], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746397, 'reachable_time': 22440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254032, 'error': None, 'target': 'ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.396 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[cbabc79b-b295-4aa5-9408-7a6f6d14f12e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:7363'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746397, 'tstamp': 746397}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254034, 'error': None, 'target': 'ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.417 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[7cefa081-042d-4927-9221-2f63dea587f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc56f578e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:73:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 234], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746397, 'reachable_time': 22440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254035, 'error': None, 'target': 'ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.449 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f77a5d-2443-4961-bf7d-5ec0c6d9fc36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.529 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3e6a72-2cc3-427c-af39-1fa7fab6f837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.530 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc56f578e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.531 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.531 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc56f578e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 NetworkManager[51205]: <info>  [1759409472.5347] manager: (tapc56f578e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Oct  2 08:51:12 np0005466013 kernel: tapc56f578e-f0: entered promiscuous mode
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.538 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc56f578e-f0, col_values=(('external_ids', {'iface-id': '14c8642a-f433-48b2-a9ce-dc24a1a84079'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:12 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:12Z|00797|binding|INFO|Releasing lport 14c8642a-f433-48b2-a9ce-dc24a1a84079 from this chassis (sb_readonly=0)
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.542 103323 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c56f578e-f013-4483-b9f2-ee1459896133.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c56f578e-f013-4483-b9f2-ee1459896133.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.543 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[c0709a55-ba4a-492a-b7ed-51b98b558420]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.545 103323 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: global
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    log         /dev/log local0 debug
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    log-tag     haproxy-metadata-proxy-c56f578e-f013-4483-b9f2-ee1459896133
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    user        root
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    group       root
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    maxconn     1024
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    pidfile     /var/lib/neutron/external/pids/c56f578e-f013-4483-b9f2-ee1459896133.pid.haproxy
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    daemon
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: defaults
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    log global
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    mode http
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    option httplog
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    option dontlognull
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    option http-server-close
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    option forwardfor
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    retries                 3
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    timeout http-request    30s
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    timeout connect         30s
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    timeout client          32s
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    timeout server          32s
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    timeout http-keep-alive 30s
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: listen listener
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    bind 169.254.169.254:80
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]:    http-request add-header X-OVN-Network-ID c56f578e-f013-4483-b9f2-ee1459896133
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:51:12 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:12.545 103323 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133', 'env', 'PROCESS_TAG=haproxy-c56f578e-f013-4483-b9f2-ee1459896133', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c56f578e-f013-4483-b9f2-ee1459896133.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.695 2 DEBUG nova.compute.manager [req-cf0a7bc0-c1d8-493d-a76e-24e32ec4dee3 req-79cc5d95-7370-45ee-85d3-5b66cd0590b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.695 2 DEBUG oslo_concurrency.lockutils [req-cf0a7bc0-c1d8-493d-a76e-24e32ec4dee3 req-79cc5d95-7370-45ee-85d3-5b66cd0590b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.696 2 DEBUG oslo_concurrency.lockutils [req-cf0a7bc0-c1d8-493d-a76e-24e32ec4dee3 req-79cc5d95-7370-45ee-85d3-5b66cd0590b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.696 2 DEBUG oslo_concurrency.lockutils [req-cf0a7bc0-c1d8-493d-a76e-24e32ec4dee3 req-79cc5d95-7370-45ee-85d3-5b66cd0590b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.696 2 DEBUG nova.compute.manager [req-cf0a7bc0-c1d8-493d-a76e-24e32ec4dee3 req-79cc5d95-7370-45ee-85d3-5b66cd0590b2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Processing event network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.884 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.885 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409472.8834286, 702895f8-1281-4b2f-8f4b-c838ce84b37a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.886 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.889 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.892 2 INFO nova.virt.libvirt.driver [-] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Instance spawned successfully.#033[00m
Oct  2 08:51:12 np0005466013 nova_compute[192144]: 2025-10-02 12:51:12.893 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:51:12 np0005466013 podman[254068]: 2025-10-02 12:51:12.973658653 +0000 UTC m=+0.086667490 container create 7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.005 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:13 np0005466013 podman[254068]: 2025-10-02 12:51:12.910299406 +0000 UTC m=+0.023308263 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.011 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:51:13 np0005466013 systemd[1]: Started libpod-conmon-7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c.scope.
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.043 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.044 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.045 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.045 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.046 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.046 2 DEBUG nova.virt.libvirt.driver [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:51:13 np0005466013 systemd[1]: Started libcrun container.
Oct  2 08:51:13 np0005466013 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ece5e59e87cd92d3319f0ea7b30fd25e8ab9126951dbdf6c0b8dedb65621b8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:51:13 np0005466013 podman[254068]: 2025-10-02 12:51:13.073605379 +0000 UTC m=+0.186614216 container init 7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:51:13 np0005466013 podman[254068]: 2025-10-02 12:51:13.078922496 +0000 UTC m=+0.191931333 container start 7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:51:13 np0005466013 neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133[254083]: [NOTICE]   (254087) : New worker (254089) forked
Oct  2 08:51:13 np0005466013 neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133[254083]: [NOTICE]   (254087) : Loading success.
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.146 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.146 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409472.8840127, 702895f8-1281-4b2f-8f4b-c838ce84b37a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.147 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.320 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.327 2 DEBUG nova.virt.driver [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] Emitting event <LifecycleEvent: 1759409472.8884478, 702895f8-1281-4b2f-8f4b-c838ce84b37a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.328 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.368 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.371 2 DEBUG nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.439 2 INFO nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Took 15.78 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.440 2 DEBUG nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.444 2 INFO nova.compute.manager [None req-da57098c-261e-45c7-a7b0-dfd8b9da835b - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.791 2 INFO nova.compute.manager [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Took 16.63 seconds to build instance.#033[00m
Oct  2 08:51:13 np0005466013 nova_compute[192144]: 2025-10-02 12:51:13.897 2 DEBUG oslo_concurrency.lockutils [None req-6e8591c5-1116-4154-9e70-c0db1397e57f 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.849s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:14 np0005466013 nova_compute[192144]: 2025-10-02 12:51:14.944 2 DEBUG nova.compute.manager [req-079ebba6-1e56-4a36-b484-e681ffceb435 req-a5d84c04-3462-4fea-8b4a-d5fd9bc4c457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:14 np0005466013 nova_compute[192144]: 2025-10-02 12:51:14.945 2 DEBUG oslo_concurrency.lockutils [req-079ebba6-1e56-4a36-b484-e681ffceb435 req-a5d84c04-3462-4fea-8b4a-d5fd9bc4c457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:14 np0005466013 nova_compute[192144]: 2025-10-02 12:51:14.945 2 DEBUG oslo_concurrency.lockutils [req-079ebba6-1e56-4a36-b484-e681ffceb435 req-a5d84c04-3462-4fea-8b4a-d5fd9bc4c457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:14 np0005466013 nova_compute[192144]: 2025-10-02 12:51:14.945 2 DEBUG oslo_concurrency.lockutils [req-079ebba6-1e56-4a36-b484-e681ffceb435 req-a5d84c04-3462-4fea-8b4a-d5fd9bc4c457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:14 np0005466013 nova_compute[192144]: 2025-10-02 12:51:14.946 2 DEBUG nova.compute.manager [req-079ebba6-1e56-4a36-b484-e681ffceb435 req-a5d84c04-3462-4fea-8b4a-d5fd9bc4c457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] No waiting events found dispatching network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:51:14 np0005466013 nova_compute[192144]: 2025-10-02 12:51:14.946 2 WARNING nova.compute.manager [req-079ebba6-1e56-4a36-b484-e681ffceb435 req-a5d84c04-3462-4fea-8b4a-d5fd9bc4c457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received unexpected event network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:51:15 np0005466013 nova_compute[192144]: 2025-10-02 12:51:15.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:15 np0005466013 podman[254098]: 2025-10-02 12:51:15.682514021 +0000 UTC m=+0.057431992 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:51:15 np0005466013 podman[254099]: 2025-10-02 12:51:15.686060823 +0000 UTC m=+0.059913681 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:51:15 np0005466013 podman[254100]: 2025-10-02 12:51:15.717057886 +0000 UTC m=+0.088638353 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.362 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000bb', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'hostId': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.362 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.365 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 702895f8-1281-4b2f-8f4b-c838ce84b37a / tap50f63236-6c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.365 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41f0f8c1-ab0c-48f2-8843-1a3f99af9f6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.363086', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c0985fc-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'e677ca2d2147e5425d926b2deccc6a5672e654b3fb590a85e066a4a5bd5dfb90'}]}, 'timestamp': '2025-10-02 12:51:16.365881', '_unique_id': 'fa8143afc2b44cb2bca41ba8bd865d2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.366 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.367 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7698c41-c9d7-4e1e-8fd3-01080361009c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.367700', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c09dcaa-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'c6f0d68072a580ebed5e5866efe4eea71637a2367cb34285303614c7bf8e8796'}]}, 'timestamp': '2025-10-02 12:51:16.368055', '_unique_id': 'ae35408661094263ac1d69ec0d5c1bef'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.368 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.369 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.378 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.379 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd95c07e5-bd62-485c-b803-16b2c45248f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.369315', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c0b8b2c-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.04754453, 'message_signature': '842c52792e50d6cc6ad874cba98ad950cfa374cc5bd9878da3acb3ddca8bc401'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 
'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.369315', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c0b94d2-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.04754453, 'message_signature': '6e72f9bcaa3e8236d39845b868ddd48d90ac136e12f6905bb8e4dc9984d61425'}]}, 'timestamp': '2025-10-02 12:51:16.379259', '_unique_id': 'f53f076e004d4281ab35acede44cee6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.380 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6be105c7-ce8b-4fb9-85ea-34edd3b3ba74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.380702', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c0bd640-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'c46497d8e150f6cdc30bb1a2d63626c8c3a285b48c3f963b0ea452f8472aa9ab'}]}, 'timestamp': '2025-10-02 12:51:16.380958', '_unique_id': 'eba5d9f68893479e83bad3680c8673b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.381 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.382 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.382 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>]
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.382 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.382 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.382 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92bf5336-eb2b-4a5e-b7dd-a19636349a0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.382631', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c0c2320-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.04754453, 'message_signature': 'cc495126aa8996a8ec52677af1e67cbb11942aca3f9b4fb57ca682c991806c9f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.382631', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c0c2e24-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.04754453, 'message_signature': '511cc151550e213a6c688d6a93df064878b3ce540ec6d152b30340dfebfdec46'}]}, 'timestamp': '2025-10-02 12:51:16.383202', '_unique_id': 'b4aa4c14099f44f79789c93a1cb8850f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.383 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.384 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.384 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.384 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>]
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.384 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.402 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.403 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce581e24-53b7-4b73-a4d9-8b46928aad50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.384913', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c0f43d4-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': 'ee3173ea05879cc0835f4bf5988ffa1154c5689d540b32bc9426aca5ac1eab46'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.384913', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c0f5086-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': '7963a7f20f565acc395cde7014d530723c48df0c587b35cbe9d3b02251a9f478'}]}, 'timestamp': '2025-10-02 12:51:16.403771', '_unique_id': '8e1a23419fb3403fb90acbbf8ec4cd95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.404 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.405 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44f5b9ec-726f-4ff1-b59e-f7082da2a784', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.405497', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c0fa1ee-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': '4978923ec8bbb1e10ad3fb421cf82344f54ea78d0e338323eb6399ceb3b78f26'}]}, 'timestamp': '2025-10-02 12:51:16.405882', '_unique_id': '35a81fbe716c4a5a823d6a9214d8b74f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.406 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.407 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.407 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.407 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0fe3b59-e76b-4552-9039-47758f2e659d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.407345', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c0fe85c-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': 'e3a3cb1e39124494e32df3acf7ace5bdef038988f03a0660248fd2fa85d1bfbc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.407345', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c0ff34c-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': 'f8d313ef3b0dbc69ab977018cca8a0348fa5a29d97d11c6a4576bc69e7bf0eba'}]}, 'timestamp': '2025-10-02 12:51:16.407948', '_unique_id': '64ae5eb4afea42338eb8256f58a8e7b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.408 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.409 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90577680-18e1-4ce7-bcb7-3f38a72a6e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.409449', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c103b36-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': '98771c0de68f540ea67b211964bf39ee666c78d44dde32566950611de6f9261e'}]}, 'timestamp': '2025-10-02 12:51:16.409785', '_unique_id': 'c35d32ab7d07480f85730f78366f7212'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.410 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.411 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.427 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.428 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 702895f8-1281-4b2f-8f4b-c838ce84b37a: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.428 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.428 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.428 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bec9927b-3f22-46e7-8f91-538c426ec43a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.428206', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c131720-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': 'b2ecb6eb354530874e23f6e15b81d2417871b3fb29979877ae9a1c32c763fedc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.428206', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c132062-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': 'c766f193428129efc5d706bda60cc1955ed64ed9a544d61e7abf0b1fe13f228e'}]}, 'timestamp': '2025-10-02 12:51:16.428703', '_unique_id': 'bd03940d51da466cb31ae88453592248'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.429 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.430 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.430 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7051154-b11f-430c-8886-b7b8211de0b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.430399', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c136b26-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'f56722ddf8d91cc03e8c713bd495a89498038e02c7754b567e5d1ace99a8e0f3'}]}, 'timestamp': '2025-10-02 12:51:16.430626', '_unique_id': 'a7ee11b75a8a41b280debfdd65291bc4'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.431 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99750ab5-4a5d-4e41-bc5b-8e69fd95b649', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.431685', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c139d44-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'f65ce950ea495a98805dcf625761dcaf37e5ef7e190999d4dd15ffde0993efca'}]}, 'timestamp': '2025-10-02 12:51:16.431925', '_unique_id': 'a2e23c9b105d480cb684bba51dd98848'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.432 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.433 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>]
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.433 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.433 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.433 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '599a4452-5a52-4027-800d-4f5f64dafa05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.433246', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c13d9ee-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.04754453, 'message_signature': '3c0ab9fa7e3e4a95d406eb532bbc00831c3780ccc710acb0ea81fe0eb4af9d74'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.433246', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c13ec90-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.04754453, 'message_signature': 'aa4f1a185cbc63f423dea4db097f51b71437fa65996578c6fdba7f3a6457543f'}]}, 'timestamp': '2025-10-02 12:51:16.433980', '_unique_id': '8586cea2a94d45b3be6e137386ba4b26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.434 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.435 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.435 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1b7f95d-9a6d-4503-81fa-45c52341d402', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.435282', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c142a34-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'dad5c100a8084d4cc5d8ceaf04b4ebec6b2e099924579757982f92e68324fc48'}]}, 'timestamp': '2025-10-02 12:51:16.435530', '_unique_id': 'd970d85882c54905aa6faa1ff8173b05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.read.latency volume: 521033918 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.436 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.read.latency volume: 3457428 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e44596b-ffb0-405b-b5db-2374eb5c6d49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 521033918, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.436636', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c145f04-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': '9e9ec2982db75a7b0854384324a2f2701d2e8cf1c65498c930f1d53e22c69565'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3457428, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.436636', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c14681e-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': 'd78345712a955581dbf3a23db78b7049880d36935f06d1e9a0250ac4ffd40594'}]}, 'timestamp': '2025-10-02 12:51:16.437087', '_unique_id': 'aeb79b245c994db4bbd16cefd168d02d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.437 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.438 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.438 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de18261a-a29d-4dfe-9d49-5b62464bd91d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.438458', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c14a5d6-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'd6a4e402d1942c4b645c03848f256b19fc1628c7fb51bd7e85c01c977c29a0ac'}]}, 'timestamp': '2025-10-02 12:51:16.438700', '_unique_id': '93305f912432454fb8d09ba814ae3aa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.439 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccfc0422-de18-4ccd-ae17-8fc227740c3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000bb-702895f8-1281-4b2f-8f4b-c838ce84b37a-tap50f63236-6c', 'timestamp': '2025-10-02T12:51:16.439772', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'tap50f63236-6c', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:17:54:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50f63236-6c'}, 'message_id': '7c14da1a-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.041299114, 'message_signature': 'eddeceddb44baaf3b0e9b16f965993a771f455e66c6f2e8975c135015ab46477'}]}, 'timestamp': '2025-10-02 12:51:16.440020', '_unique_id': '94f00326313f47adafb03f607370f393'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.440 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.441 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.441 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.441 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732>]
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.441 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.441 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.441 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d4bb2f9-5ed0-48db-8d2b-bd3abf972749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.441435', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c1519d0-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': '5f16a9a625806e53e7db92396d57da751f7819e806e152432ebadbca7d41dd5c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 
'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.441435', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c152394-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': '4c17e57f9df79fd829c3f180ed936c3ceb05fbb5ff246facf4e4b02ed698fa6d'}]}, 'timestamp': '2025-10-02 12:51:16.441904', '_unique_id': '7b07588b65d34f21b063bd6990e06d01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.442 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad9d38b4-a54e-42bf-92b0-fa583e4b66b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-vda', 'timestamp': '2025-10-02T12:51:16.443041', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7c1558b4-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': 'daddeac2e5458155cc1042eb86e0910ba82c4f998ba3eb6bab229f5810583a80'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': 
'575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a-sda', 'timestamp': '2025-10-02T12:51:16.443041', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7c156016-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.06313375, 'message_signature': '6ebab3b68a9242806c7fe708b481ebdd6bbaef2fa0ee1422022de4f69ea3247c'}]}, 'timestamp': '2025-10-02 12:51:16.443433', '_unique_id': '541b4bf9e99c45b38173df22d5065dc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.443 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.444 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.444 12 DEBUG ceilometer.compute.pollsters [-] 702895f8-1281-4b2f-8f4b-c838ce84b37a/cpu volume: 3370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '631ce52c-f4d4-4693-b41e-93994747ef7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3370000000, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'timestamp': '2025-10-02T12:51:16.444487', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732', 'name': 'instance-000000bb', 'instance_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'instance_type': 'm1.nano', 'host': 'cb0ce57b3c71748ae8a4bc550db694a08565036fcdc052c2f5e493a4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7c15913a-9f8e-11f0-9b9a-fa163ec2af05', 'monotonic_time': 7468.105943912, 'message_signature': 'fbc64957b68ae49fafd81c6c924daac333d2b21eefd527d56c43dc172f2e43b6'}]}, 'timestamp': '2025-10-02 12:51:16.444720', '_unique_id': '78be79f52479433db662e1e8c06b0b86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:51:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:51:16.445 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:51:16 np0005466013 nova_compute[192144]: 2025-10-02 12:51:16.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:20 np0005466013 nova_compute[192144]: 2025-10-02 12:51:20.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:20 np0005466013 nova_compute[192144]: 2025-10-02 12:51:20.587 2 DEBUG nova.compute.manager [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-changed-50f63236-6cb1-4e39-bed8-36dac99fd408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:20 np0005466013 nova_compute[192144]: 2025-10-02 12:51:20.587 2 DEBUG nova.compute.manager [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Refreshing instance network info cache due to event network-changed-50f63236-6cb1-4e39-bed8-36dac99fd408. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:51:20 np0005466013 nova_compute[192144]: 2025-10-02 12:51:20.587 2 DEBUG oslo_concurrency.lockutils [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:51:20 np0005466013 nova_compute[192144]: 2025-10-02 12:51:20.587 2 DEBUG oslo_concurrency.lockutils [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:51:20 np0005466013 nova_compute[192144]: 2025-10-02 12:51:20.588 2 DEBUG nova.network.neutron [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Refreshing network info cache for port 50f63236-6cb1-4e39-bed8-36dac99fd408 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:51:21 np0005466013 nova_compute[192144]: 2025-10-02 12:51:21.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:23 np0005466013 nova_compute[192144]: 2025-10-02 12:51:23.090 2 DEBUG nova.network.neutron [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Updated VIF entry in instance network info cache for port 50f63236-6cb1-4e39-bed8-36dac99fd408. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:51:23 np0005466013 nova_compute[192144]: 2025-10-02 12:51:23.090 2 DEBUG nova.network.neutron [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Updating instance_info_cache with network_info: [{"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:23 np0005466013 nova_compute[192144]: 2025-10-02 12:51:23.233 2 DEBUG oslo_concurrency.lockutils [req-e4836da5-5c03-4d12-b48c-9a42571a90a0 req-44c800d5-6457-4316-9b56-fc69e3498c4a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-702895f8-1281-4b2f-8f4b-c838ce84b37a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:51:25 np0005466013 nova_compute[192144]: 2025-10-02 12:51:25.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:25 np0005466013 podman[254180]: 2025-10-02 12:51:25.628247162 +0000 UTC m=+0.057499856 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:51:25 np0005466013 podman[254181]: 2025-10-02 12:51:25.637493002 +0000 UTC m=+0.051297471 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct  2 08:51:25 np0005466013 podman[254182]: 2025-10-02 12:51:25.651319445 +0000 UTC m=+0.069689708 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:51:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:26Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:54:85 10.100.0.9
Oct  2 08:51:26 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:26Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:54:85 10.100.0.9
Oct  2 08:51:26 np0005466013 nova_compute[192144]: 2025-10-02 12:51:26.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:29 np0005466013 podman[254240]: 2025-10-02 12:51:29.682655573 +0000 UTC m=+0.057700281 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:51:29 np0005466013 podman[254241]: 2025-10-02 12:51:29.697570471 +0000 UTC m=+0.062071629 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  2 08:51:30 np0005466013 nova_compute[192144]: 2025-10-02 12:51:30.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:31 np0005466013 nova_compute[192144]: 2025-10-02 12:51:31.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.434 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.434 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.434 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.434 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.435 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.527 2 INFO nova.compute.manager [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Terminating instance#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.551 2 DEBUG nova.compute.manager [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:51:33 np0005466013 kernel: tap50f63236-6c (unregistering): left promiscuous mode
Oct  2 08:51:33 np0005466013 NetworkManager[51205]: <info>  [1759409493.5806] device (tap50f63236-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:51:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:33Z|00798|binding|INFO|Releasing lport 50f63236-6cb1-4e39-bed8-36dac99fd408 from this chassis (sb_readonly=0)
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:33Z|00799|binding|INFO|Setting lport 50f63236-6cb1-4e39-bed8-36dac99fd408 down in Southbound
Oct  2 08:51:33 np0005466013 ovn_controller[94366]: 2025-10-02T12:51:33Z|00800|binding|INFO|Removing iface tap50f63236-6c ovn-installed in OVS
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:33.624 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:54:85 10.100.0.9'], port_security=['fa:16:3e:17:54:85 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '702895f8-1281-4b2f-8f4b-c838ce84b37a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c56f578e-f013-4483-b9f2-ee1459896133', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0c614090-7feb-4374-a7f9-1cb3b34f8e4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50529df0-c539-4067-a62b-3ef6d48b20aa, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>], logical_port=50f63236-6cb1-4e39-bed8-36dac99fd408) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7febe61f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:51:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:33.627 103323 INFO neutron.agent.ovn.metadata.agent [-] Port 50f63236-6cb1-4e39-bed8-36dac99fd408 in datapath c56f578e-f013-4483-b9f2-ee1459896133 unbound from our chassis#033[00m
Oct  2 08:51:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:33.628 103323 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c56f578e-f013-4483-b9f2-ee1459896133, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:51:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:33.630 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[9c042fc5-860e-473c-9ca6-10bfc7f304be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:33 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:33.630 103323 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133 namespace which is not needed anymore#033[00m
Oct  2 08:51:33 np0005466013 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Oct  2 08:51:33 np0005466013 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000bb.scope: Consumed 12.696s CPU time.
Oct  2 08:51:33 np0005466013 systemd-machined[152202]: Machine qemu-83-instance-000000bb terminated.
Oct  2 08:51:33 np0005466013 neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133[254083]: [NOTICE]   (254087) : haproxy version is 2.8.14-c23fe91
Oct  2 08:51:33 np0005466013 neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133[254083]: [NOTICE]   (254087) : path to executable is /usr/sbin/haproxy
Oct  2 08:51:33 np0005466013 neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133[254083]: [WARNING]  (254087) : Exiting Master process...
Oct  2 08:51:33 np0005466013 neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133[254083]: [ALERT]    (254087) : Current worker (254089) exited with code 143 (Terminated)
Oct  2 08:51:33 np0005466013 neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133[254083]: [WARNING]  (254087) : All workers exited. Exiting... (0)
Oct  2 08:51:33 np0005466013 systemd[1]: libpod-7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c.scope: Deactivated successfully.
Oct  2 08:51:33 np0005466013 podman[254309]: 2025-10-02 12:51:33.77513788 +0000 UTC m=+0.052108475 container died 7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:51:33 np0005466013 systemd[1]: var-lib-containers-storage-overlay-4ece5e59e87cd92d3319f0ea7b30fd25e8ab9126951dbdf6c0b8dedb65621b8d-merged.mount: Deactivated successfully.
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.826 2 INFO nova.virt.libvirt.driver [-] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Instance destroyed successfully.#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.827 2 DEBUG nova.objects.instance [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'resources' on Instance uuid 702895f8-1281-4b2f-8f4b-c838ce84b37a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:51:33 np0005466013 podman[254309]: 2025-10-02 12:51:33.83794374 +0000 UTC m=+0.114914385 container cleanup 7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:51:33 np0005466013 systemd[1]: libpod-conmon-7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c.scope: Deactivated successfully.
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.878 2 DEBUG nova.virt.libvirt.vif [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:50:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-1-1251183732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ge',id=187,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPtdLgRziYi/gtQwh2c90NnE9jWcSnkXXhVGvo+TNtzW3MSE83NoyumTXAaB/UU4ExIeaKr77+vb5N+WSXOcyyn7dDdXMPaG0pk4M0kXpEwbShNs9Jn1NVnaa85coBBWBQ==',key_name='tempest-TestSecurityGroupsBasicOps-1574692784',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:51:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-0ovuvx2b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:51:13Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=702895f8-1281-4b2f-8f4b-c838ce84b37a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.879 2 DEBUG nova.network.os_vif_util [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "50f63236-6cb1-4e39-bed8-36dac99fd408", "address": "fa:16:3e:17:54:85", "network": {"id": "c56f578e-f013-4483-b9f2-ee1459896133", "bridge": "br-int", "label": "tempest-network-smoke--1680080003", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f63236-6c", "ovs_interfaceid": "50f63236-6cb1-4e39-bed8-36dac99fd408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.880 2 DEBUG nova.network.os_vif_util [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:54:85,bridge_name='br-int',has_traffic_filtering=True,id=50f63236-6cb1-4e39-bed8-36dac99fd408,network=Network(c56f578e-f013-4483-b9f2-ee1459896133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f63236-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.880 2 DEBUG os_vif [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:54:85,bridge_name='br-int',has_traffic_filtering=True,id=50f63236-6cb1-4e39-bed8-36dac99fd408,network=Network(c56f578e-f013-4483-b9f2-ee1459896133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f63236-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.883 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50f63236-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.889 2 INFO os_vif [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:54:85,bridge_name='br-int',has_traffic_filtering=True,id=50f63236-6cb1-4e39-bed8-36dac99fd408,network=Network(c56f578e-f013-4483-b9f2-ee1459896133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f63236-6c')#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.890 2 INFO nova.virt.libvirt.driver [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Deleting instance files /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a_del#033[00m
Oct  2 08:51:33 np0005466013 nova_compute[192144]: 2025-10-02 12:51:33.890 2 INFO nova.virt.libvirt.driver [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Deletion of /var/lib/nova/instances/702895f8-1281-4b2f-8f4b-c838ce84b37a_del complete#033[00m
Oct  2 08:51:34 np0005466013 podman[254356]: 2025-10-02 12:51:34.010116343 +0000 UTC m=+0.150998609 container remove 7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.015 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e9f8f2-c266-439c-88f7-e7370ffbf310]: (4, ('Thu Oct  2 12:51:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133 (7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c)\n7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c\nThu Oct  2 12:51:33 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133 (7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c)\n7457633d4d7cf7509fc86816cd9e97a770ed14346b33eae35f19b09b1bb6b97c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.018 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[149ff193-c7e7-49f5-84e9-288c1e7c3122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.019 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc56f578e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:34 np0005466013 kernel: tapc56f578e-f0: left promiscuous mode
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.025 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[f6086bb1-50fb-49e4-bccd-73383583ddd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.056 2 DEBUG nova.compute.manager [req-b741afe1-3f18-4bae-b47d-dc09a07d2de8 req-a3bae0d0-7b93-40e7-b3d0-b8cc739ba74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-vif-unplugged-50f63236-6cb1-4e39-bed8-36dac99fd408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.057 2 DEBUG oslo_concurrency.lockutils [req-b741afe1-3f18-4bae-b47d-dc09a07d2de8 req-a3bae0d0-7b93-40e7-b3d0-b8cc739ba74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.057 2 DEBUG oslo_concurrency.lockutils [req-b741afe1-3f18-4bae-b47d-dc09a07d2de8 req-a3bae0d0-7b93-40e7-b3d0-b8cc739ba74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.057 2 DEBUG oslo_concurrency.lockutils [req-b741afe1-3f18-4bae-b47d-dc09a07d2de8 req-a3bae0d0-7b93-40e7-b3d0-b8cc739ba74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.057 2 DEBUG nova.compute.manager [req-b741afe1-3f18-4bae-b47d-dc09a07d2de8 req-a3bae0d0-7b93-40e7-b3d0-b8cc739ba74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] No waiting events found dispatching network-vif-unplugged-50f63236-6cb1-4e39-bed8-36dac99fd408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.057 2 DEBUG nova.compute.manager [req-b741afe1-3f18-4bae-b47d-dc09a07d2de8 req-a3bae0d0-7b93-40e7-b3d0-b8cc739ba74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-vif-unplugged-50f63236-6cb1-4e39-bed8-36dac99fd408 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.060 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[52a5fb6b-478a-4a23-a61f-ab2b324ef560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.061 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4d071d80-e41d-4c50-8b1c-8e85ab02716e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.077 219962 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0886a5-b469-4dc3-87fe-a5941fdac64e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746389, 'reachable_time': 42110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254369, 'error': None, 'target': 'ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.080 103439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c56f578e-f013-4483-b9f2-ee1459896133 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:51:34 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:34.080 103439 DEBUG oslo.privsep.daemon [-] privsep: reply[3cbce2fb-376a-4104-acb7-e4b8c32e48b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:51:34 np0005466013 systemd[1]: run-netns-ovnmeta\x2dc56f578e\x2df013\x2d4483\x2db9f2\x2dee1459896133.mount: Deactivated successfully.
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.139 2 INFO nova.compute.manager [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.140 2 DEBUG oslo.service.loopingcall [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.140 2 DEBUG nova.compute.manager [-] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:51:34 np0005466013 nova_compute[192144]: 2025-10-02 12:51:34.140 2 DEBUG nova.network.neutron [-] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:51:36 np0005466013 nova_compute[192144]: 2025-10-02 12:51:36.199 2 DEBUG nova.compute.manager [req-db0fd659-e4bc-4240-baf1-50946d59a7ee req-d901eaca-8f54-41ec-abb6-8af54b218fbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:36 np0005466013 nova_compute[192144]: 2025-10-02 12:51:36.200 2 DEBUG oslo_concurrency.lockutils [req-db0fd659-e4bc-4240-baf1-50946d59a7ee req-d901eaca-8f54-41ec-abb6-8af54b218fbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:36 np0005466013 nova_compute[192144]: 2025-10-02 12:51:36.200 2 DEBUG oslo_concurrency.lockutils [req-db0fd659-e4bc-4240-baf1-50946d59a7ee req-d901eaca-8f54-41ec-abb6-8af54b218fbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:36 np0005466013 nova_compute[192144]: 2025-10-02 12:51:36.200 2 DEBUG oslo_concurrency.lockutils [req-db0fd659-e4bc-4240-baf1-50946d59a7ee req-d901eaca-8f54-41ec-abb6-8af54b218fbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:36 np0005466013 nova_compute[192144]: 2025-10-02 12:51:36.201 2 DEBUG nova.compute.manager [req-db0fd659-e4bc-4240-baf1-50946d59a7ee req-d901eaca-8f54-41ec-abb6-8af54b218fbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] No waiting events found dispatching network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:51:36 np0005466013 nova_compute[192144]: 2025-10-02 12:51:36.201 2 WARNING nova.compute.manager [req-db0fd659-e4bc-4240-baf1-50946d59a7ee req-d901eaca-8f54-41ec-abb6-8af54b218fbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received unexpected event network-vif-plugged-50f63236-6cb1-4e39-bed8-36dac99fd408 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:51:36 np0005466013 nova_compute[192144]: 2025-10-02 12:51:36.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:37 np0005466013 nova_compute[192144]: 2025-10-02 12:51:37.238 2 DEBUG nova.network.neutron [-] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:37.276 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:51:37 np0005466013 nova_compute[192144]: 2025-10-02 12:51:37.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:37 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:37.278 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:51:37 np0005466013 nova_compute[192144]: 2025-10-02 12:51:37.374 2 DEBUG nova.compute.manager [req-66947bff-747c-418e-a14b-437903779bb7 req-d7b893fd-9f5b-49d1-bbe7-6170eabedaca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Received event network-vif-deleted-50f63236-6cb1-4e39-bed8-36dac99fd408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:51:37 np0005466013 nova_compute[192144]: 2025-10-02 12:51:37.375 2 INFO nova.compute.manager [req-66947bff-747c-418e-a14b-437903779bb7 req-d7b893fd-9f5b-49d1-bbe7-6170eabedaca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Neutron deleted interface 50f63236-6cb1-4e39-bed8-36dac99fd408; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:51:37 np0005466013 nova_compute[192144]: 2025-10-02 12:51:37.375 2 DEBUG nova.network.neutron [req-66947bff-747c-418e-a14b-437903779bb7 req-d7b893fd-9f5b-49d1-bbe7-6170eabedaca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:51:37 np0005466013 nova_compute[192144]: 2025-10-02 12:51:37.911 2 INFO nova.compute.manager [-] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Took 3.77 seconds to deallocate network for instance.#033[00m
Oct  2 08:51:37 np0005466013 nova_compute[192144]: 2025-10-02 12:51:37.916 2 DEBUG nova.compute.manager [req-66947bff-747c-418e-a14b-437903779bb7 req-d7b893fd-9f5b-49d1-bbe7-6170eabedaca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Detach interface failed, port_id=50f63236-6cb1-4e39-bed8-36dac99fd408, reason: Instance 702895f8-1281-4b2f-8f4b-c838ce84b37a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:51:38 np0005466013 nova_compute[192144]: 2025-10-02 12:51:38.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:39 np0005466013 nova_compute[192144]: 2025-10-02 12:51:39.257 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:39 np0005466013 nova_compute[192144]: 2025-10-02 12:51:39.257 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:39 np0005466013 nova_compute[192144]: 2025-10-02 12:51:39.329 2 DEBUG nova.compute.provider_tree [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:51:39 np0005466013 nova_compute[192144]: 2025-10-02 12:51:39.430 2 DEBUG nova.scheduler.client.report [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:51:39 np0005466013 nova_compute[192144]: 2025-10-02 12:51:39.567 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:39 np0005466013 nova_compute[192144]: 2025-10-02 12:51:39.756 2 INFO nova.scheduler.client.report [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Deleted allocations for instance 702895f8-1281-4b2f-8f4b-c838ce84b37a#033[00m
Oct  2 08:51:40 np0005466013 nova_compute[192144]: 2025-10-02 12:51:40.242 2 DEBUG oslo_concurrency.lockutils [None req-f21d43f8-7746-449e-9aaf-7574268de597 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "702895f8-1281-4b2f-8f4b-c838ce84b37a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.808s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:41 np0005466013 nova_compute[192144]: 2025-10-02 12:51:41.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:42 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:51:42.281 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:43 np0005466013 nova_compute[192144]: 2025-10-02 12:51:43.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:46 np0005466013 nova_compute[192144]: 2025-10-02 12:51:46.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:46 np0005466013 podman[254372]: 2025-10-02 12:51:46.679274439 +0000 UTC m=+0.050742384 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:51:46 np0005466013 podman[254371]: 2025-10-02 12:51:46.682652664 +0000 UTC m=+0.057682241 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:51:46 np0005466013 podman[254373]: 2025-10-02 12:51:46.71055536 +0000 UTC m=+0.079005780 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:51:48 np0005466013 nova_compute[192144]: 2025-10-02 12:51:48.826 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409493.8237746, 702895f8-1281-4b2f-8f4b-c838ce84b37a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:51:48 np0005466013 nova_compute[192144]: 2025-10-02 12:51:48.826 2 INFO nova.compute.manager [-] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:51:48 np0005466013 nova_compute[192144]: 2025-10-02 12:51:48.884 2 DEBUG nova.compute.manager [None req-877d91ac-23dc-4a4f-a144-590dc5b9ec29 - - - - - -] [instance: 702895f8-1281-4b2f-8f4b-c838ce84b37a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:51:48 np0005466013 nova_compute[192144]: 2025-10-02 12:51:48.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:51 np0005466013 nova_compute[192144]: 2025-10-02 12:51:51.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:52 np0005466013 nova_compute[192144]: 2025-10-02 12:51:52.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:53 np0005466013 nova_compute[192144]: 2025-10-02 12:51:53.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:53 np0005466013 nova_compute[192144]: 2025-10-02 12:51:53.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:53 np0005466013 nova_compute[192144]: 2025-10-02 12:51:53.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:51:53 np0005466013 nova_compute[192144]: 2025-10-02 12:51:53.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.189 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.189 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.189 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.190 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.325 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.325 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5690MB free_disk=73.1318359375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.326 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:54 np0005466013 nova_compute[192144]: 2025-10-02 12:51:54.326 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:55 np0005466013 nova_compute[192144]: 2025-10-02 12:51:55.073 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:51:55 np0005466013 nova_compute[192144]: 2025-10-02 12:51:55.073 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:51:55 np0005466013 nova_compute[192144]: 2025-10-02 12:51:55.101 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:51:55 np0005466013 nova_compute[192144]: 2025-10-02 12:51:55.250 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:51:55 np0005466013 nova_compute[192144]: 2025-10-02 12:51:55.320 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:51:55 np0005466013 nova_compute[192144]: 2025-10-02 12:51:55.320 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:56 np0005466013 nova_compute[192144]: 2025-10-02 12:51:56.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:56 np0005466013 podman[254439]: 2025-10-02 12:51:56.683858404 +0000 UTC m=+0.062214133 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:51:56 np0005466013 podman[254440]: 2025-10-02 12:51:56.689591434 +0000 UTC m=+0.065075573 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:51:56 np0005466013 podman[254441]: 2025-10-02 12:51:56.699653399 +0000 UTC m=+0.073726993 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  2 08:51:57 np0005466013 nova_compute[192144]: 2025-10-02 12:51:57.321 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:58 np0005466013 nova_compute[192144]: 2025-10-02 12:51:58.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:00 np0005466013 podman[254497]: 2025-10-02 12:52:00.669545702 +0000 UTC m=+0.049121442 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:52:00 np0005466013 podman[254498]: 2025-10-02 12:52:00.675523919 +0000 UTC m=+0.054381486 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid)
Oct  2 08:52:00 np0005466013 nova_compute[192144]: 2025-10-02 12:52:00.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:01 np0005466013 nova_compute[192144]: 2025-10-02 12:52:01.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:01 np0005466013 nova_compute[192144]: 2025-10-02 12:52:01.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:01 np0005466013 nova_compute[192144]: 2025-10-02 12:52:01.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:52:02.337 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:52:02.338 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:52:02.338 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:02 np0005466013 nova_compute[192144]: 2025-10-02 12:52:02.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:03 np0005466013 nova_compute[192144]: 2025-10-02 12:52:03.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:04 np0005466013 nova_compute[192144]: 2025-10-02 12:52:04.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:06 np0005466013 nova_compute[192144]: 2025-10-02 12:52:06.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:06 np0005466013 nova_compute[192144]: 2025-10-02 12:52:06.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:06 np0005466013 nova_compute[192144]: 2025-10-02 12:52:06.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:52:06 np0005466013 nova_compute[192144]: 2025-10-02 12:52:06.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:52:07 np0005466013 nova_compute[192144]: 2025-10-02 12:52:07.038 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:52:07 np0005466013 nova_compute[192144]: 2025-10-02 12:52:07.038 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:08 np0005466013 nova_compute[192144]: 2025-10-02 12:52:08.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:11 np0005466013 nova_compute[192144]: 2025-10-02 12:52:11.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:13 np0005466013 nova_compute[192144]: 2025-10-02 12:52:13.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:16 np0005466013 nova_compute[192144]: 2025-10-02 12:52:16.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:17 np0005466013 podman[254541]: 2025-10-02 12:52:17.675626995 +0000 UTC m=+0.052500968 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:52:17 np0005466013 podman[254540]: 2025-10-02 12:52:17.685664081 +0000 UTC m=+0.056326959 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:52:17 np0005466013 podman[254542]: 2025-10-02 12:52:17.740619124 +0000 UTC m=+0.115160484 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Oct  2 08:52:18 np0005466013 nova_compute[192144]: 2025-10-02 12:52:18.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:21 np0005466013 nova_compute[192144]: 2025-10-02 12:52:21.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:23 np0005466013 nova_compute[192144]: 2025-10-02 12:52:23.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:52:25.484 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:52:25 np0005466013 nova_compute[192144]: 2025-10-02 12:52:25.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:25 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:52:25.485 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:52:26 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:52:26.488 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:52:26 np0005466013 nova_compute[192144]: 2025-10-02 12:52:26.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:27 np0005466013 podman[254606]: 2025-10-02 12:52:27.707564699 +0000 UTC m=+0.069243064 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Oct  2 08:52:27 np0005466013 podman[254605]: 2025-10-02 12:52:27.711549754 +0000 UTC m=+0.080365663 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:52:27 np0005466013 podman[254607]: 2025-10-02 12:52:27.71302194 +0000 UTC m=+0.068662965 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:52:28 np0005466013 nova_compute[192144]: 2025-10-02 12:52:28.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:31 np0005466013 nova_compute[192144]: 2025-10-02 12:52:31.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:31 np0005466013 podman[254664]: 2025-10-02 12:52:31.665722693 +0000 UTC m=+0.046341144 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:52:31 np0005466013 podman[254665]: 2025-10-02 12:52:31.692920637 +0000 UTC m=+0.070666408 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  2 08:52:33 np0005466013 nova_compute[192144]: 2025-10-02 12:52:33.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:36 np0005466013 nova_compute[192144]: 2025-10-02 12:52:36.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:38 np0005466013 nova_compute[192144]: 2025-10-02 12:52:38.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:41 np0005466013 nova_compute[192144]: 2025-10-02 12:52:41.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:43 np0005466013 nova_compute[192144]: 2025-10-02 12:52:43.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:46 np0005466013 nova_compute[192144]: 2025-10-02 12:52:46.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:48 np0005466013 podman[254709]: 2025-10-02 12:52:48.684162306 +0000 UTC m=+0.055721000 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  2 08:52:48 np0005466013 podman[254708]: 2025-10-02 12:52:48.705609238 +0000 UTC m=+0.070800002 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:52:48 np0005466013 podman[254710]: 2025-10-02 12:52:48.713707562 +0000 UTC m=+0.084962737 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:52:48 np0005466013 nova_compute[192144]: 2025-10-02 12:52:48.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:51 np0005466013 nova_compute[192144]: 2025-10-02 12:52:51.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:53 np0005466013 nova_compute[192144]: 2025-10-02 12:52:53.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:53 np0005466013 nova_compute[192144]: 2025-10-02 12:52:53.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:53 np0005466013 nova_compute[192144]: 2025-10-02 12:52:53.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:53 np0005466013 nova_compute[192144]: 2025-10-02 12:52:53.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:52:54 np0005466013 nova_compute[192144]: 2025-10-02 12:52:54.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.085 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.086 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.086 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.086 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.216 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.217 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.13262939453125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.217 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:55 np0005466013 nova_compute[192144]: 2025-10-02 12:52:55.218 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:56 np0005466013 ovn_controller[94366]: 2025-10-02T12:52:56Z|00801|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct  2 08:52:56 np0005466013 nova_compute[192144]: 2025-10-02 12:52:56.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:57 np0005466013 nova_compute[192144]: 2025-10-02 12:52:57.115 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:52:57 np0005466013 nova_compute[192144]: 2025-10-02 12:52:57.116 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:52:57 np0005466013 nova_compute[192144]: 2025-10-02 12:52:57.782 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:52:57 np0005466013 nova_compute[192144]: 2025-10-02 12:52:57.931 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:52:57 np0005466013 nova_compute[192144]: 2025-10-02 12:52:57.932 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:52:57 np0005466013 nova_compute[192144]: 2025-10-02 12:52:57.933 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:58 np0005466013 podman[254775]: 2025-10-02 12:52:58.679800182 +0000 UTC m=+0.052897960 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:52:58 np0005466013 podman[254774]: 2025-10-02 12:52:58.68131129 +0000 UTC m=+0.057701662 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:52:58 np0005466013 podman[254773]: 2025-10-02 12:52:58.682143646 +0000 UTC m=+0.063345909 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:52:58 np0005466013 nova_compute[192144]: 2025-10-02 12:52:58.933 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:58 np0005466013 nova_compute[192144]: 2025-10-02 12:52:58.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:01 np0005466013 nova_compute[192144]: 2025-10-02 12:53:01.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:53:02.339 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:53:02.339 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:53:02.339 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:02 np0005466013 podman[254829]: 2025-10-02 12:53:02.684220088 +0000 UTC m=+0.050497415 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:53:02 np0005466013 podman[254830]: 2025-10-02 12:53:02.693125788 +0000 UTC m=+0.052392775 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:53:02 np0005466013 nova_compute[192144]: 2025-10-02 12:53:02.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:03 np0005466013 nova_compute[192144]: 2025-10-02 12:53:03.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:03 np0005466013 nova_compute[192144]: 2025-10-02 12:53:03.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:04 np0005466013 nova_compute[192144]: 2025-10-02 12:53:04.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:06 np0005466013 nova_compute[192144]: 2025-10-02 12:53:06.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:07 np0005466013 nova_compute[192144]: 2025-10-02 12:53:07.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:07 np0005466013 nova_compute[192144]: 2025-10-02 12:53:07.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:53:07 np0005466013 nova_compute[192144]: 2025-10-02 12:53:07.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:53:08 np0005466013 nova_compute[192144]: 2025-10-02 12:53:08.017 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:53:08 np0005466013 nova_compute[192144]: 2025-10-02 12:53:08.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:08 np0005466013 nova_compute[192144]: 2025-10-02 12:53:08.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:10 np0005466013 nova_compute[192144]: 2025-10-02 12:53:10.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:11 np0005466013 nova_compute[192144]: 2025-10-02 12:53:11.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:13 np0005466013 nova_compute[192144]: 2025-10-02 12:53:13.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:53:16.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:53:16 np0005466013 nova_compute[192144]: 2025-10-02 12:53:16.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:18 np0005466013 nova_compute[192144]: 2025-10-02 12:53:18.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:19 np0005466013 podman[254870]: 2025-10-02 12:53:19.673232048 +0000 UTC m=+0.047324036 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:53:19 np0005466013 podman[254871]: 2025-10-02 12:53:19.681641532 +0000 UTC m=+0.052184939 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent)
Oct  2 08:53:19 np0005466013 podman[254872]: 2025-10-02 12:53:19.761666543 +0000 UTC m=+0.120381479 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:53:21 np0005466013 nova_compute[192144]: 2025-10-02 12:53:21.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:23 np0005466013 nova_compute[192144]: 2025-10-02 12:53:23.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:26 np0005466013 nova_compute[192144]: 2025-10-02 12:53:26.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:28 np0005466013 nova_compute[192144]: 2025-10-02 12:53:28.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:29 np0005466013 podman[254944]: 2025-10-02 12:53:29.697337397 +0000 UTC m=+0.062063328 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:53:29 np0005466013 podman[254943]: 2025-10-02 12:53:29.702711136 +0000 UTC m=+0.072968541 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, config_id=edpm)
Oct  2 08:53:29 np0005466013 podman[254942]: 2025-10-02 12:53:29.728738402 +0000 UTC m=+0.100998580 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:53:31 np0005466013 nova_compute[192144]: 2025-10-02 12:53:31.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:33 np0005466013 podman[254997]: 2025-10-02 12:53:33.671187873 +0000 UTC m=+0.047734188 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:53:33 np0005466013 podman[254998]: 2025-10-02 12:53:33.692787331 +0000 UTC m=+0.062629625 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:53:33 np0005466013 nova_compute[192144]: 2025-10-02 12:53:33.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:36 np0005466013 nova_compute[192144]: 2025-10-02 12:53:36.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:38 np0005466013 nova_compute[192144]: 2025-10-02 12:53:38.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:41 np0005466013 nova_compute[192144]: 2025-10-02 12:53:41.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:43 np0005466013 nova_compute[192144]: 2025-10-02 12:53:43.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:46 np0005466013 nova_compute[192144]: 2025-10-02 12:53:46.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:48 np0005466013 nova_compute[192144]: 2025-10-02 12:53:48.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:50 np0005466013 podman[255041]: 2025-10-02 12:53:50.680076479 +0000 UTC m=+0.051250437 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:53:50 np0005466013 podman[255042]: 2025-10-02 12:53:50.714518049 +0000 UTC m=+0.081154745 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 08:53:50 np0005466013 podman[255043]: 2025-10-02 12:53:50.740650719 +0000 UTC m=+0.105339555 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  2 08:53:51 np0005466013 nova_compute[192144]: 2025-10-02 12:53:51.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:53 np0005466013 nova_compute[192144]: 2025-10-02 12:53:53.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:54 np0005466013 nova_compute[192144]: 2025-10-02 12:53:54.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:54 np0005466013 nova_compute[192144]: 2025-10-02 12:53:54.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:54 np0005466013 nova_compute[192144]: 2025-10-02 12:53:54.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:53:54 np0005466013 nova_compute[192144]: 2025-10-02 12:53:54.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.025 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.026 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.234 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.235 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5712MB free_disk=73.1335678100586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.235 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.236 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.311 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.312 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.334 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.347 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.348 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:53:55 np0005466013 nova_compute[192144]: 2025-10-02 12:53:55.349 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:53:56.077 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:56 np0005466013 nova_compute[192144]: 2025-10-02 12:53:56.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:56 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:53:56.078 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:53:56 np0005466013 nova_compute[192144]: 2025-10-02 12:53:56.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:58 np0005466013 nova_compute[192144]: 2025-10-02 12:53:58.349 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:59 np0005466013 nova_compute[192144]: 2025-10-02 12:53:59.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:59 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:53:59.081 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:00 np0005466013 podman[255108]: 2025-10-02 12:54:00.677134719 +0000 UTC m=+0.052181598 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:54:00 np0005466013 podman[255110]: 2025-10-02 12:54:00.677275213 +0000 UTC m=+0.047327455 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:54:00 np0005466013 podman[255109]: 2025-10-02 12:54:00.689650411 +0000 UTC m=+0.060003533 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350)
Oct  2 08:54:01 np0005466013 nova_compute[192144]: 2025-10-02 12:54:01.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:54:02.340 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:54:02.340 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:54:02.341 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:02 np0005466013 nova_compute[192144]: 2025-10-02 12:54:02.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:02 np0005466013 nova_compute[192144]: 2025-10-02 12:54:02.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:04 np0005466013 nova_compute[192144]: 2025-10-02 12:54:04.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:04 np0005466013 podman[255166]: 2025-10-02 12:54:04.678085795 +0000 UTC m=+0.060070986 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:54:04 np0005466013 podman[255167]: 2025-10-02 12:54:04.715656533 +0000 UTC m=+0.084892314 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:54:04 np0005466013 nova_compute[192144]: 2025-10-02 12:54:04.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:05 np0005466013 nova_compute[192144]: 2025-10-02 12:54:05.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:06 np0005466013 nova_compute[192144]: 2025-10-02 12:54:06.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:07 np0005466013 nova_compute[192144]: 2025-10-02 12:54:07.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:07 np0005466013 nova_compute[192144]: 2025-10-02 12:54:07.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:54:07 np0005466013 nova_compute[192144]: 2025-10-02 12:54:07.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:54:08 np0005466013 nova_compute[192144]: 2025-10-02 12:54:08.010 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:54:09 np0005466013 nova_compute[192144]: 2025-10-02 12:54:09.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:09 np0005466013 nova_compute[192144]: 2025-10-02 12:54:09.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:11 np0005466013 nova_compute[192144]: 2025-10-02 12:54:11.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:14 np0005466013 nova_compute[192144]: 2025-10-02 12:54:14.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:16 np0005466013 nova_compute[192144]: 2025-10-02 12:54:16.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:19 np0005466013 nova_compute[192144]: 2025-10-02 12:54:19.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:21 np0005466013 nova_compute[192144]: 2025-10-02 12:54:21.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:21 np0005466013 podman[255208]: 2025-10-02 12:54:21.67270729 +0000 UTC m=+0.050169165 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:54:21 np0005466013 podman[255209]: 2025-10-02 12:54:21.683271721 +0000 UTC m=+0.053717106 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 08:54:21 np0005466013 podman[255210]: 2025-10-02 12:54:21.71643004 +0000 UTC m=+0.084875892 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:54:24 np0005466013 nova_compute[192144]: 2025-10-02 12:54:24.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:26 np0005466013 nova_compute[192144]: 2025-10-02 12:54:26.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:29 np0005466013 nova_compute[192144]: 2025-10-02 12:54:29.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:31 np0005466013 nova_compute[192144]: 2025-10-02 12:54:31.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:31 np0005466013 podman[255276]: 2025-10-02 12:54:31.677204139 +0000 UTC m=+0.058923709 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:54:31 np0005466013 podman[255277]: 2025-10-02 12:54:31.70849893 +0000 UTC m=+0.074241039 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:54:31 np0005466013 podman[255278]: 2025-10-02 12:54:31.715599213 +0000 UTC m=+0.081324101 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:54:34 np0005466013 nova_compute[192144]: 2025-10-02 12:54:34.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:35 np0005466013 podman[255336]: 2025-10-02 12:54:35.6738889 +0000 UTC m=+0.050802384 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:54:35 np0005466013 podman[255337]: 2025-10-02 12:54:35.680657043 +0000 UTC m=+0.054071527 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:54:36 np0005466013 nova_compute[192144]: 2025-10-02 12:54:36.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:39 np0005466013 nova_compute[192144]: 2025-10-02 12:54:39.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:41 np0005466013 nova_compute[192144]: 2025-10-02 12:54:41.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:44 np0005466013 nova_compute[192144]: 2025-10-02 12:54:44.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:46 np0005466013 nova_compute[192144]: 2025-10-02 12:54:46.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:49 np0005466013 nova_compute[192144]: 2025-10-02 12:54:49.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:51 np0005466013 nova_compute[192144]: 2025-10-02 12:54:51.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:52 np0005466013 podman[255389]: 2025-10-02 12:54:52.700481128 +0000 UTC m=+0.076752877 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:54:52 np0005466013 podman[255388]: 2025-10-02 12:54:52.704168204 +0000 UTC m=+0.083269482 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:54:52 np0005466013 podman[255390]: 2025-10-02 12:54:52.714426455 +0000 UTC m=+0.084735357 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 08:54:54 np0005466013 nova_compute[192144]: 2025-10-02 12:54:54.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:54 np0005466013 nova_compute[192144]: 2025-10-02 12:54:54.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:54 np0005466013 nova_compute[192144]: 2025-10-02 12:54:54.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:54:55 np0005466013 nova_compute[192144]: 2025-10-02 12:54:55.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:55 np0005466013 nova_compute[192144]: 2025-10-02 12:54:55.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.026 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.027 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.168 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.170 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5720MB free_disk=73.1335678100586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.170 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.171 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.236 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.237 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.258 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.273 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.274 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.275 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:56 np0005466013 nova_compute[192144]: 2025-10-02 12:54:56.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:59 np0005466013 nova_compute[192144]: 2025-10-02 12:54:59.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:59 np0005466013 nova_compute[192144]: 2025-10-02 12:54:59.275 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:01 np0005466013 nova_compute[192144]: 2025-10-02 12:55:01.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:55:02.341 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:55:02.341 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:55:02.341 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:02 np0005466013 podman[255457]: 2025-10-02 12:55:02.672656716 +0000 UTC m=+0.051073553 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:55:02 np0005466013 podman[255458]: 2025-10-02 12:55:02.680623735 +0000 UTC m=+0.054224741 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:55:02 np0005466013 podman[255459]: 2025-10-02 12:55:02.680960056 +0000 UTC m=+0.053238331 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:55:03 np0005466013 nova_compute[192144]: 2025-10-02 12:55:03.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:04 np0005466013 nova_compute[192144]: 2025-10-02 12:55:04.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:05 np0005466013 nova_compute[192144]: 2025-10-02 12:55:05.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:06 np0005466013 nova_compute[192144]: 2025-10-02 12:55:06.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:06 np0005466013 podman[255514]: 2025-10-02 12:55:06.667996436 +0000 UTC m=+0.049018829 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:55:06 np0005466013 podman[255515]: 2025-10-02 12:55:06.675645135 +0000 UTC m=+0.052501028 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:55:07 np0005466013 nova_compute[192144]: 2025-10-02 12:55:07.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:07 np0005466013 nova_compute[192144]: 2025-10-02 12:55:07.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:07 np0005466013 nova_compute[192144]: 2025-10-02 12:55:07.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:55:07 np0005466013 nova_compute[192144]: 2025-10-02 12:55:07.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:55:08 np0005466013 nova_compute[192144]: 2025-10-02 12:55:08.027 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:55:09 np0005466013 nova_compute[192144]: 2025-10-02 12:55:09.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:11 np0005466013 nova_compute[192144]: 2025-10-02 12:55:11.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:11 np0005466013 nova_compute[192144]: 2025-10-02 12:55:11.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:12 np0005466013 nova_compute[192144]: 2025-10-02 12:55:12.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:12 np0005466013 nova_compute[192144]: 2025-10-02 12:55:12.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:55:14 np0005466013 nova_compute[192144]: 2025-10-02 12:55:14.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005466013 nova_compute[192144]: 2025-10-02 12:55:14.085 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:14 np0005466013 nova_compute[192144]: 2025-10-02 12:55:14.104 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:14 np0005466013 nova_compute[192144]: 2025-10-02 12:55:14.104 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:55:14 np0005466013 nova_compute[192144]: 2025-10-02 12:55:14.122 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:55:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466013 nova_compute[192144]: 2025-10-02 12:55:16.377 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:16 np0005466013 nova_compute[192144]: 2025-10-02 12:55:16.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:19 np0005466013 nova_compute[192144]: 2025-10-02 12:55:19.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:21 np0005466013 nova_compute[192144]: 2025-10-02 12:55:21.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:21 np0005466013 systemd-logind[784]: New session 42 of user zuul.
Oct  2 08:55:21 np0005466013 systemd[1]: Started Session 42 of User zuul.
Oct  2 08:55:23 np0005466013 podman[255606]: 2025-10-02 12:55:23.382010078 +0000 UTC m=+0.063556814 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:55:23 np0005466013 podman[255605]: 2025-10-02 12:55:23.382001737 +0000 UTC m=+0.066894608 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:55:23 np0005466013 podman[255607]: 2025-10-02 12:55:23.412955978 +0000 UTC m=+0.092625965 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:55:24 np0005466013 nova_compute[192144]: 2025-10-02 12:55:24.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:26 np0005466013 nova_compute[192144]: 2025-10-02 12:55:26.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:28 np0005466013 ovs-vsctl[255818]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  2 08:55:29 np0005466013 nova_compute[192144]: 2025-10-02 12:55:29.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:29 np0005466013 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 255595 (sos)
Oct  2 08:55:29 np0005466013 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct  2 08:55:29 np0005466013 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct  2 08:55:29 np0005466013 virtqemud[191867]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  2 08:55:29 np0005466013 virtqemud[191867]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  2 08:55:29 np0005466013 virtqemud[191867]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 08:55:30 np0005466013 kernel: block vda: the capability attribute has been deprecated.
Oct  2 08:55:31 np0005466013 nova_compute[192144]: 2025-10-02 12:55:31.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:33 np0005466013 systemd[1]: Starting Hostname Service...
Oct  2 08:55:33 np0005466013 podman[256354]: 2025-10-02 12:55:33.285627385 +0000 UTC m=+0.067075264 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm)
Oct  2 08:55:33 np0005466013 podman[256353]: 2025-10-02 12:55:33.314873593 +0000 UTC m=+0.095964200 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:55:33 np0005466013 podman[256355]: 2025-10-02 12:55:33.323223275 +0000 UTC m=+0.102050052 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:55:33 np0005466013 systemd[1]: Started Hostname Service.
Oct  2 08:55:33 np0005466013 nova_compute[192144]: 2025-10-02 12:55:33.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:34 np0005466013 nova_compute[192144]: 2025-10-02 12:55:34.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:36 np0005466013 nova_compute[192144]: 2025-10-02 12:55:36.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:37 np0005466013 podman[256621]: 2025-10-02 12:55:37.675239499 +0000 UTC m=+0.047864772 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:55:37 np0005466013 podman[256622]: 2025-10-02 12:55:37.716451992 +0000 UTC m=+0.082208120 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:55:38 np0005466013 podman[206422]: time="2025-10-02T12:55:38Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  2 08:55:38 np0005466013 podman[206422]: @ - - [02/Oct/2025:12:55:38 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 25331 "" "Go-http-client/1.1"
Oct  2 08:55:39 np0005466013 nova_compute[192144]: 2025-10-02 12:55:39.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:41 np0005466013 ovs-appctl[257399]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 08:55:41 np0005466013 ovs-appctl[257403]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 08:55:41 np0005466013 ovs-appctl[257408]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 08:55:41 np0005466013 nova_compute[192144]: 2025-10-02 12:55:41.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:44 np0005466013 nova_compute[192144]: 2025-10-02 12:55:44.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:46 np0005466013 nova_compute[192144]: 2025-10-02 12:55:46.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:49 np0005466013 nova_compute[192144]: 2025-10-02 12:55:49.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:50 np0005466013 nova_compute[192144]: 2025-10-02 12:55:50.499 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:51 np0005466013 nova_compute[192144]: 2025-10-02 12:55:51.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:53 np0005466013 podman[258587]: 2025-10-02 12:55:53.749164741 +0000 UTC m=+0.084942625 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:55:53 np0005466013 podman[258586]: 2025-10-02 12:55:53.755551051 +0000 UTC m=+0.094072951 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:55:53 np0005466013 podman[258588]: 2025-10-02 12:55:53.821621503 +0000 UTC m=+0.161910728 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:55:54 np0005466013 nova_compute[192144]: 2025-10-02 12:55:54.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:55 np0005466013 nova_compute[192144]: 2025-10-02 12:55:54.999 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:55 np0005466013 nova_compute[192144]: 2025-10-02 12:55:55.000 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:55:56 np0005466013 nova_compute[192144]: 2025-10-02 12:55:56.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:56 np0005466013 nova_compute[192144]: 2025-10-02 12:55:56.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:56 np0005466013 nova_compute[192144]: 2025-10-02 12:55:56.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.031 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.032 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.032 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.033 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:55:57 np0005466013 virtqemud[191867]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.194 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.195 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5516MB free_disk=72.69837188720703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.195 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.196 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.312 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.313 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.336 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.500 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.500 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.524 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.556 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.575 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.673 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.866 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:55:57 np0005466013 nova_compute[192144]: 2025-10-02 12:55:57.866 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:59 np0005466013 nova_compute[192144]: 2025-10-02 12:55:59.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:00 np0005466013 systemd[1]: Starting Time & Date Service...
Oct  2 08:56:00 np0005466013 systemd[1]: Started Time & Date Service.
Oct  2 08:56:01 np0005466013 nova_compute[192144]: 2025-10-02 12:56:01.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:01 np0005466013 nova_compute[192144]: 2025-10-02 12:56:01.865 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:56:02.342 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:56:02.342 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:56:02.342 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:03 np0005466013 podman[259146]: 2025-10-02 12:56:03.454734777 +0000 UTC m=+0.075496758 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct  2 08:56:03 np0005466013 podman[259145]: 2025-10-02 12:56:03.456944966 +0000 UTC m=+0.080316859 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:56:03 np0005466013 podman[259147]: 2025-10-02 12:56:03.48926036 +0000 UTC m=+0.107923525 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:56:04 np0005466013 nova_compute[192144]: 2025-10-02 12:56:04.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:05 np0005466013 nova_compute[192144]: 2025-10-02 12:56:05.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:05 np0005466013 nova_compute[192144]: 2025-10-02 12:56:05.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:06 np0005466013 nova_compute[192144]: 2025-10-02 12:56:06.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:07 np0005466013 nova_compute[192144]: 2025-10-02 12:56:07.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:08 np0005466013 podman[259202]: 2025-10-02 12:56:08.086800454 +0000 UTC m=+0.069352846 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:56:08 np0005466013 podman[259201]: 2025-10-02 12:56:08.093781643 +0000 UTC m=+0.074523559 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:56:08 np0005466013 nova_compute[192144]: 2025-10-02 12:56:08.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:08 np0005466013 nova_compute[192144]: 2025-10-02 12:56:08.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:56:08 np0005466013 nova_compute[192144]: 2025-10-02 12:56:08.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:56:09 np0005466013 nova_compute[192144]: 2025-10-02 12:56:09.019 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:56:09 np0005466013 nova_compute[192144]: 2025-10-02 12:56:09.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:11 np0005466013 nova_compute[192144]: 2025-10-02 12:56:11.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:12 np0005466013 nova_compute[192144]: 2025-10-02 12:56:12.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:14 np0005466013 nova_compute[192144]: 2025-10-02 12:56:14.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:16 np0005466013 nova_compute[192144]: 2025-10-02 12:56:16.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:19 np0005466013 nova_compute[192144]: 2025-10-02 12:56:19.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:21 np0005466013 nova_compute[192144]: 2025-10-02 12:56:21.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:22 np0005466013 systemd[1]: session-42.scope: Deactivated successfully.
Oct  2 08:56:22 np0005466013 systemd[1]: session-42.scope: Consumed 1min 23.066s CPU time, 817.4M memory peak, read 335.3M from disk, written 217.2M to disk.
Oct  2 08:56:22 np0005466013 systemd-logind[784]: Session 42 logged out. Waiting for processes to exit.
Oct  2 08:56:22 np0005466013 systemd-logind[784]: Removed session 42.
Oct  2 08:56:22 np0005466013 systemd-logind[784]: New session 43 of user zuul.
Oct  2 08:56:22 np0005466013 systemd[1]: Started Session 43 of User zuul.
Oct  2 08:56:23 np0005466013 systemd[1]: session-43.scope: Deactivated successfully.
Oct  2 08:56:23 np0005466013 systemd-logind[784]: Session 43 logged out. Waiting for processes to exit.
Oct  2 08:56:23 np0005466013 systemd-logind[784]: Removed session 43.
Oct  2 08:56:23 np0005466013 systemd-logind[784]: New session 44 of user zuul.
Oct  2 08:56:23 np0005466013 systemd[1]: Started Session 44 of User zuul.
Oct  2 08:56:23 np0005466013 systemd[1]: session-44.scope: Deactivated successfully.
Oct  2 08:56:23 np0005466013 systemd-logind[784]: Session 44 logged out. Waiting for processes to exit.
Oct  2 08:56:23 np0005466013 systemd-logind[784]: Removed session 44.
Oct  2 08:56:24 np0005466013 nova_compute[192144]: 2025-10-02 12:56:24.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:24 np0005466013 podman[259302]: 2025-10-02 12:56:24.67774493 +0000 UTC m=+0.052260630 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:56:24 np0005466013 podman[259301]: 2025-10-02 12:56:24.679799264 +0000 UTC m=+0.054264892 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:56:24 np0005466013 podman[259303]: 2025-10-02 12:56:24.754152275 +0000 UTC m=+0.126432185 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 08:56:26 np0005466013 nova_compute[192144]: 2025-10-02 12:56:26.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:29 np0005466013 nova_compute[192144]: 2025-10-02 12:56:29.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:30 np0005466013 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 08:56:30 np0005466013 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 08:56:31 np0005466013 nova_compute[192144]: 2025-10-02 12:56:31.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:33 np0005466013 podman[259376]: 2025-10-02 12:56:33.684575313 +0000 UTC m=+0.055515891 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:56:33 np0005466013 podman[259374]: 2025-10-02 12:56:33.707479462 +0000 UTC m=+0.082893551 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd)
Oct  2 08:56:33 np0005466013 podman[259375]: 2025-10-02 12:56:33.707549874 +0000 UTC m=+0.082101596 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Oct  2 08:56:34 np0005466013 nova_compute[192144]: 2025-10-02 12:56:34.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:36 np0005466013 nova_compute[192144]: 2025-10-02 12:56:36.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:38 np0005466013 podman[259437]: 2025-10-02 12:56:38.674147823 +0000 UTC m=+0.051529427 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid)
Oct  2 08:56:38 np0005466013 podman[259436]: 2025-10-02 12:56:38.703568375 +0000 UTC m=+0.080970050 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:56:39 np0005466013 nova_compute[192144]: 2025-10-02 12:56:39.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:41 np0005466013 nova_compute[192144]: 2025-10-02 12:56:41.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:44 np0005466013 nova_compute[192144]: 2025-10-02 12:56:44.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:46 np0005466013 nova_compute[192144]: 2025-10-02 12:56:46.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:49 np0005466013 nova_compute[192144]: 2025-10-02 12:56:49.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:49 np0005466013 systemd[1]: Starting dnf makecache...
Oct  2 08:56:49 np0005466013 dnf[259481]: Metadata cache refreshed recently.
Oct  2 08:56:49 np0005466013 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  2 08:56:49 np0005466013 systemd[1]: Finished dnf makecache.
Oct  2 08:56:51 np0005466013 nova_compute[192144]: 2025-10-02 12:56:51.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:54 np0005466013 nova_compute[192144]: 2025-10-02 12:56:54.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:54 np0005466013 nova_compute[192144]: 2025-10-02 12:56:54.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:54 np0005466013 nova_compute[192144]: 2025-10-02 12:56:54.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:56:55 np0005466013 podman[259482]: 2025-10-02 12:56:55.689747575 +0000 UTC m=+0.061458478 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:56:55 np0005466013 podman[259483]: 2025-10-02 12:56:55.693079719 +0000 UTC m=+0.059396894 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:56:55 np0005466013 podman[259484]: 2025-10-02 12:56:55.719657392 +0000 UTC m=+0.084223082 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:56:56 np0005466013 nova_compute[192144]: 2025-10-02 12:56:56.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:56 np0005466013 nova_compute[192144]: 2025-10-02 12:56:56.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:57 np0005466013 nova_compute[192144]: 2025-10-02 12:56:57.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.042 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.043 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.043 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.043 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.210 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.211 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5627MB free_disk=73.13312530517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.211 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.211 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.327 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.328 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.357 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.375 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.408 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:56:58 np0005466013 nova_compute[192144]: 2025-10-02 12:56:58.408 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:56:59 np0005466013 nova_compute[192144]: 2025-10-02 12:56:59.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:01 np0005466013 nova_compute[192144]: 2025-10-02 12:57:01.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:57:02.343 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:57:02.343 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:57:02.343 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:02 np0005466013 nova_compute[192144]: 2025-10-02 12:57:02.408 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:04 np0005466013 nova_compute[192144]: 2025-10-02 12:57:04.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:04 np0005466013 podman[259548]: 2025-10-02 12:57:04.690884901 +0000 UTC m=+0.056618007 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:57:04 np0005466013 podman[259547]: 2025-10-02 12:57:04.691151069 +0000 UTC m=+0.060009413 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, version=9.6, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:57:04 np0005466013 podman[259546]: 2025-10-02 12:57:04.708871835 +0000 UTC m=+0.082992514 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:57:05 np0005466013 nova_compute[192144]: 2025-10-02 12:57:05.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:05 np0005466013 nova_compute[192144]: 2025-10-02 12:57:05.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:06 np0005466013 nova_compute[192144]: 2025-10-02 12:57:06.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:09 np0005466013 nova_compute[192144]: 2025-10-02 12:57:09.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:09 np0005466013 podman[259605]: 2025-10-02 12:57:09.677929989 +0000 UTC m=+0.051915309 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:57:09 np0005466013 podman[259606]: 2025-10-02 12:57:09.716471868 +0000 UTC m=+0.087702812 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:57:09 np0005466013 nova_compute[192144]: 2025-10-02 12:57:09.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:10 np0005466013 nova_compute[192144]: 2025-10-02 12:57:10.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:10 np0005466013 nova_compute[192144]: 2025-10-02 12:57:10.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:57:10 np0005466013 nova_compute[192144]: 2025-10-02 12:57:10.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:57:11 np0005466013 nova_compute[192144]: 2025-10-02 12:57:11.042 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:57:11 np0005466013 nova_compute[192144]: 2025-10-02 12:57:11.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:12 np0005466013 nova_compute[192144]: 2025-10-02 12:57:12.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:14 np0005466013 nova_compute[192144]: 2025-10-02 12:57:14.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:57:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466013 nova_compute[192144]: 2025-10-02 12:57:16.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:18 np0005466013 nova_compute[192144]: 2025-10-02 12:57:18.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:19 np0005466013 nova_compute[192144]: 2025-10-02 12:57:19.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:21 np0005466013 nova_compute[192144]: 2025-10-02 12:57:21.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:24 np0005466013 nova_compute[192144]: 2025-10-02 12:57:24.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:26 np0005466013 podman[259649]: 2025-10-02 12:57:26.703566065 +0000 UTC m=+0.075540071 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:57:26 np0005466013 podman[259650]: 2025-10-02 12:57:26.709706026 +0000 UTC m=+0.078338137 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:57:26 np0005466013 podman[259651]: 2025-10-02 12:57:26.748259486 +0000 UTC m=+0.111071294 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Oct  2 08:57:26 np0005466013 nova_compute[192144]: 2025-10-02 12:57:26.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:29 np0005466013 nova_compute[192144]: 2025-10-02 12:57:29.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:31 np0005466013 nova_compute[192144]: 2025-10-02 12:57:31.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:34 np0005466013 nova_compute[192144]: 2025-10-02 12:57:34.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:35 np0005466013 podman[259711]: 2025-10-02 12:57:35.696615227 +0000 UTC m=+0.069150650 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:57:35 np0005466013 podman[259712]: 2025-10-02 12:57:35.702905854 +0000 UTC m=+0.074387573 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Oct  2 08:57:35 np0005466013 podman[259713]: 2025-10-02 12:57:35.707028493 +0000 UTC m=+0.079138902 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct  2 08:57:36 np0005466013 nova_compute[192144]: 2025-10-02 12:57:36.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:39 np0005466013 nova_compute[192144]: 2025-10-02 12:57:39.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:40 np0005466013 podman[259772]: 2025-10-02 12:57:40.68447461 +0000 UTC m=+0.060272481 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:57:40 np0005466013 podman[259773]: 2025-10-02 12:57:40.704353124 +0000 UTC m=+0.076582683 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid)
Oct  2 08:57:41 np0005466013 nova_compute[192144]: 2025-10-02 12:57:41.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:44 np0005466013 nova_compute[192144]: 2025-10-02 12:57:44.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:46 np0005466013 nova_compute[192144]: 2025-10-02 12:57:46.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:49 np0005466013 nova_compute[192144]: 2025-10-02 12:57:49.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:51 np0005466013 nova_compute[192144]: 2025-10-02 12:57:51.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:54 np0005466013 nova_compute[192144]: 2025-10-02 12:57:54.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:55 np0005466013 nova_compute[192144]: 2025-10-02 12:57:55.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:57:55 np0005466013 nova_compute[192144]: 2025-10-02 12:57:55.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:57:56 np0005466013 nova_compute[192144]: 2025-10-02 12:57:56.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:56 np0005466013 nova_compute[192144]: 2025-10-02 12:57:56.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:57:57 np0005466013 podman[259816]: 2025-10-02 12:57:57.689609473 +0000 UTC m=+0.065969689 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:57:57 np0005466013 podman[259817]: 2025-10-02 12:57:57.722271349 +0000 UTC m=+0.091813421 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:57:57 np0005466013 podman[259823]: 2025-10-02 12:57:57.728925767 +0000 UTC m=+0.088809066 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:57:59 np0005466013 nova_compute[192144]: 2025-10-02 12:57:59.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:57:59 np0005466013 nova_compute[192144]: 2025-10-02 12:57:59.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.024 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.025 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.026 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.158 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.159 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5650MB free_disk=73.13312530517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.160 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.160 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.234 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.235 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.533 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.549 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.551 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:58:00 np0005466013 nova_compute[192144]: 2025-10-02 12:58:00.552 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:58:01 np0005466013 nova_compute[192144]: 2025-10-02 12:58:01.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:58:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:58:02.344 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:58:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:58:02.345 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:58:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:58:02.345 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:58:04 np0005466013 nova_compute[192144]: 2025-10-02 12:58:04.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:58:04 np0005466013 nova_compute[192144]: 2025-10-02 12:58:04.553 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:58:05 np0005466013 nova_compute[192144]: 2025-10-02 12:58:05.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:58:06 np0005466013 podman[259886]: 2025-10-02 12:58:06.666672539 +0000 UTC m=+0.047007158 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:58:06 np0005466013 podman[259887]: 2025-10-02 12:58:06.674307579 +0000 UTC m=+0.052013115 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  2 08:58:06 np0005466013 podman[259888]: 2025-10-02 12:58:06.700185601 +0000 UTC m=+0.076438672 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:58:06 np0005466013 nova_compute[192144]: 2025-10-02 12:58:06.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:07 np0005466013 nova_compute[192144]: 2025-10-02 12:58:07.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:09 np0005466013 nova_compute[192144]: 2025-10-02 12:58:09.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:09 np0005466013 nova_compute[192144]: 2025-10-02 12:58:09.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:11 np0005466013 podman[259942]: 2025-10-02 12:58:11.664238721 +0000 UTC m=+0.046052028 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:58:11 np0005466013 podman[259943]: 2025-10-02 12:58:11.671304743 +0000 UTC m=+0.049305200 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:58:11 np0005466013 nova_compute[192144]: 2025-10-02 12:58:11.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:12 np0005466013 nova_compute[192144]: 2025-10-02 12:58:12.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:12 np0005466013 nova_compute[192144]: 2025-10-02 12:58:12.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:58:12 np0005466013 nova_compute[192144]: 2025-10-02 12:58:12.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:58:13 np0005466013 nova_compute[192144]: 2025-10-02 12:58:13.009 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:58:14 np0005466013 nova_compute[192144]: 2025-10-02 12:58:14.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:14 np0005466013 nova_compute[192144]: 2025-10-02 12:58:14.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:16 np0005466013 nova_compute[192144]: 2025-10-02 12:58:16.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:19 np0005466013 nova_compute[192144]: 2025-10-02 12:58:19.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:21 np0005466013 nova_compute[192144]: 2025-10-02 12:58:21.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:24 np0005466013 nova_compute[192144]: 2025-10-02 12:58:24.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:26 np0005466013 nova_compute[192144]: 2025-10-02 12:58:26.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:28 np0005466013 podman[259986]: 2025-10-02 12:58:28.670772944 +0000 UTC m=+0.048242956 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:58:28 np0005466013 podman[259985]: 2025-10-02 12:58:28.705739123 +0000 UTC m=+0.083915847 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:58:28 np0005466013 podman[259987]: 2025-10-02 12:58:28.74862254 +0000 UTC m=+0.114018803 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:58:29 np0005466013 nova_compute[192144]: 2025-10-02 12:58:29.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:31 np0005466013 nova_compute[192144]: 2025-10-02 12:58:31.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:34 np0005466013 nova_compute[192144]: 2025-10-02 12:58:34.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:36 np0005466013 nova_compute[192144]: 2025-10-02 12:58:36.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:37 np0005466013 podman[260053]: 2025-10-02 12:58:37.699673775 +0000 UTC m=+0.063513307 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:58:37 np0005466013 podman[260052]: 2025-10-02 12:58:37.704386623 +0000 UTC m=+0.072570561 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:58:37 np0005466013 podman[260054]: 2025-10-02 12:58:37.720702495 +0000 UTC m=+0.078348423 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:58:39 np0005466013 nova_compute[192144]: 2025-10-02 12:58:39.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:41 np0005466013 nova_compute[192144]: 2025-10-02 12:58:41.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:42 np0005466013 podman[260113]: 2025-10-02 12:58:42.694322937 +0000 UTC m=+0.068215424 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:58:42 np0005466013 podman[260114]: 2025-10-02 12:58:42.722306236 +0000 UTC m=+0.083955008 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:58:44 np0005466013 nova_compute[192144]: 2025-10-02 12:58:44.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:46 np0005466013 nova_compute[192144]: 2025-10-02 12:58:46.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:49 np0005466013 nova_compute[192144]: 2025-10-02 12:58:49.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:51 np0005466013 nova_compute[192144]: 2025-10-02 12:58:51.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:54 np0005466013 nova_compute[192144]: 2025-10-02 12:58:54.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:56 np0005466013 nova_compute[192144]: 2025-10-02 12:58:56.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:56 np0005466013 nova_compute[192144]: 2025-10-02 12:58:56.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:57 np0005466013 nova_compute[192144]: 2025-10-02 12:58:57.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:57 np0005466013 nova_compute[192144]: 2025-10-02 12:58:57.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:58:59 np0005466013 nova_compute[192144]: 2025-10-02 12:58:59.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:59 np0005466013 podman[260155]: 2025-10-02 12:58:59.661566407 +0000 UTC m=+0.042712633 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:58:59 np0005466013 podman[260156]: 2025-10-02 12:58:59.68266522 +0000 UTC m=+0.055271248 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, tcib_managed=true)
Oct  2 08:58:59 np0005466013 podman[260157]: 2025-10-02 12:58:59.712981333 +0000 UTC m=+0.084948421 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller)
Oct  2 08:59:01 np0005466013 nova_compute[192144]: 2025-10-02 12:59:01.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:01 np0005466013 nova_compute[192144]: 2025-10-02 12:59:01.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.035 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.173 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.174 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5667MB free_disk=73.13312530517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.174 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:59:02.344 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:59:02.345 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 12:59:02.345 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.405 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.405 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.494 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.521 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.522 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:59:02 np0005466013 nova_compute[192144]: 2025-10-02 12:59:02.522 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:04 np0005466013 nova_compute[192144]: 2025-10-02 12:59:04.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:05 np0005466013 nova_compute[192144]: 2025-10-02 12:59:05.523 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:06 np0005466013 nova_compute[192144]: 2025-10-02 12:59:06.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:07 np0005466013 nova_compute[192144]: 2025-10-02 12:59:07.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:08 np0005466013 podman[260225]: 2025-10-02 12:59:08.685392238 +0000 UTC m=+0.060834802 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:59:08 np0005466013 podman[260224]: 2025-10-02 12:59:08.692044268 +0000 UTC m=+0.070559358 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Oct  2 08:59:08 np0005466013 podman[260223]: 2025-10-02 12:59:08.704571351 +0000 UTC m=+0.083602018 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 08:59:09 np0005466013 nova_compute[192144]: 2025-10-02 12:59:09.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:09 np0005466013 nova_compute[192144]: 2025-10-02 12:59:09.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:11 np0005466013 nova_compute[192144]: 2025-10-02 12:59:11.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:11 np0005466013 nova_compute[192144]: 2025-10-02 12:59:11.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:12 np0005466013 nova_compute[192144]: 2025-10-02 12:59:12.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:12 np0005466013 nova_compute[192144]: 2025-10-02 12:59:12.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:59:12 np0005466013 nova_compute[192144]: 2025-10-02 12:59:12.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:59:13 np0005466013 nova_compute[192144]: 2025-10-02 12:59:13.209 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:59:13 np0005466013 podman[260285]: 2025-10-02 12:59:13.689997191 +0000 UTC m=+0.069745023 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:59:13 np0005466013 podman[260286]: 2025-10-02 12:59:13.70586947 +0000 UTC m=+0.077927450 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:59:14 np0005466013 nova_compute[192144]: 2025-10-02 12:59:14.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 12:59:16.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466013 nova_compute[192144]: 2025-10-02 12:59:16.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:16 np0005466013 nova_compute[192144]: 2025-10-02 12:59:16.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:19 np0005466013 nova_compute[192144]: 2025-10-02 12:59:19.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:19 np0005466013 nova_compute[192144]: 2025-10-02 12:59:19.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:21 np0005466013 nova_compute[192144]: 2025-10-02 12:59:21.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:24 np0005466013 nova_compute[192144]: 2025-10-02 12:59:24.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:26 np0005466013 nova_compute[192144]: 2025-10-02 12:59:26.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:29 np0005466013 nova_compute[192144]: 2025-10-02 12:59:29.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:30 np0005466013 podman[260329]: 2025-10-02 12:59:30.761021732 +0000 UTC m=+0.125684760 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:59:30 np0005466013 podman[260330]: 2025-10-02 12:59:30.795167414 +0000 UTC m=+0.153610966 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:59:30 np0005466013 podman[260331]: 2025-10-02 12:59:30.802782314 +0000 UTC m=+0.161042801 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Oct  2 08:59:31 np0005466013 nova_compute[192144]: 2025-10-02 12:59:31.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:34 np0005466013 nova_compute[192144]: 2025-10-02 12:59:34.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:36 np0005466013 nova_compute[192144]: 2025-10-02 12:59:36.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:39 np0005466013 nova_compute[192144]: 2025-10-02 12:59:39.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:39 np0005466013 podman[260397]: 2025-10-02 12:59:39.702644799 +0000 UTC m=+0.073065146 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 08:59:39 np0005466013 podman[260398]: 2025-10-02 12:59:39.70966825 +0000 UTC m=+0.076138673 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:59:39 np0005466013 podman[260396]: 2025-10-02 12:59:39.744184465 +0000 UTC m=+0.107830719 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  2 08:59:41 np0005466013 nova_compute[192144]: 2025-10-02 12:59:41.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:44 np0005466013 nova_compute[192144]: 2025-10-02 12:59:44.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:44 np0005466013 podman[260452]: 2025-10-02 12:59:44.689458325 +0000 UTC m=+0.064326651 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:59:44 np0005466013 podman[260451]: 2025-10-02 12:59:44.709275418 +0000 UTC m=+0.078064393 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:59:46 np0005466013 nova_compute[192144]: 2025-10-02 12:59:46.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:49 np0005466013 nova_compute[192144]: 2025-10-02 12:59:49.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:51 np0005466013 nova_compute[192144]: 2025-10-02 12:59:51.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:51 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:59:51 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:59:51 np0005466013 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:59:54 np0005466013 nova_compute[192144]: 2025-10-02 12:59:54.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:55 np0005466013 nova_compute[192144]: 2025-10-02 12:59:55.012 2 DEBUG oslo_concurrency.processutils [None req-659d4632-c79e-4633-9620-923cb0652eb1 6f66e2b43c7641758f7c71dec37ebcb6 c543175414e2485bb476e4dfce01c394 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:55 np0005466013 nova_compute[192144]: 2025-10-02 12:59:55.037 2 DEBUG oslo_concurrency.processutils [None req-659d4632-c79e-4633-9620-923cb0652eb1 6f66e2b43c7641758f7c71dec37ebcb6 c543175414e2485bb476e4dfce01c394 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:56 np0005466013 nova_compute[192144]: 2025-10-02 12:59:56.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:57 np0005466013 nova_compute[192144]: 2025-10-02 12:59:57.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:57 np0005466013 nova_compute[192144]: 2025-10-02 12:59:57.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:59:58 np0005466013 nova_compute[192144]: 2025-10-02 12:59:58.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:59 np0005466013 nova_compute[192144]: 2025-10-02 12:59:59.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:01 np0005466013 podman[260498]: 2025-10-02 13:00:01.690058603 +0000 UTC m=+0.065505118 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:00:01 np0005466013 podman[260500]: 2025-10-02 13:00:01.721723858 +0000 UTC m=+0.094793519 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:00:01 np0005466013 podman[260499]: 2025-10-02 13:00:01.724044771 +0000 UTC m=+0.090051760 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  2 09:00:01 np0005466013 nova_compute[192144]: 2025-10-02 13:00:01.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:00:02.191 103323 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:00:02 np0005466013 nova_compute[192144]: 2025-10-02 13:00:02.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:00:02.193 103323 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:00:02.345 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:00:02.346 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:00:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:03 np0005466013 nova_compute[192144]: 2025-10-02 13:00:03.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:04 np0005466013 nova_compute[192144]: 2025-10-02 13:00:04.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.320 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.320 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.320 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.321 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.467 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.468 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5678MB free_disk=73.13312530517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.468 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.469 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.546 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.546 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.569 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.588 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.589 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:00:05 np0005466013 nova_compute[192144]: 2025-10-02 13:00:05.589 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:06 np0005466013 nova_compute[192144]: 2025-10-02 13:00:06.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:07 np0005466013 nova_compute[192144]: 2025-10-02 13:00:07.588 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:09 np0005466013 nova_compute[192144]: 2025-10-02 13:00:09.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:09 np0005466013 nova_compute[192144]: 2025-10-02 13:00:09.997 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:09 np0005466013 nova_compute[192144]: 2025-10-02 13:00:09.998 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:10 np0005466013 podman[260565]: 2025-10-02 13:00:10.682758984 +0000 UTC m=+0.054708070 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 09:00:10 np0005466013 podman[260566]: 2025-10-02 13:00:10.683639661 +0000 UTC m=+0.053712658 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, tcib_managed=true)
Oct  2 09:00:10 np0005466013 podman[260564]: 2025-10-02 13:00:10.703118513 +0000 UTC m=+0.068378479 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 09:00:11 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:00:11.196 103323 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1fc220e5-4479-4f53-8f4d-9aefe7dad458, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:11 np0005466013 nova_compute[192144]: 2025-10-02 13:00:11.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:13 np0005466013 nova_compute[192144]: 2025-10-02 13:00:13.990 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:14 np0005466013 nova_compute[192144]: 2025-10-02 13:00:14.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:14 np0005466013 nova_compute[192144]: 2025-10-02 13:00:14.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:14 np0005466013 nova_compute[192144]: 2025-10-02 13:00:14.997 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:00:14 np0005466013 nova_compute[192144]: 2025-10-02 13:00:14.997 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:00:15 np0005466013 nova_compute[192144]: 2025-10-02 13:00:15.030 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:00:15 np0005466013 nova_compute[192144]: 2025-10-02 13:00:15.031 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:15 np0005466013 nova_compute[192144]: 2025-10-02 13:00:15.031 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:00:15 np0005466013 podman[260623]: 2025-10-02 13:00:15.705315944 +0000 UTC m=+0.075285876 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 09:00:15 np0005466013 podman[260624]: 2025-10-02 13:00:15.707323977 +0000 UTC m=+0.075929166 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 09:00:16 np0005466013 nova_compute[192144]: 2025-10-02 13:00:16.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:17 np0005466013 nova_compute[192144]: 2025-10-02 13:00:17.008 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:19 np0005466013 nova_compute[192144]: 2025-10-02 13:00:19.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:22 np0005466013 nova_compute[192144]: 2025-10-02 13:00:22.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:24 np0005466013 nova_compute[192144]: 2025-10-02 13:00:24.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:26 np0005466013 nova_compute[192144]: 2025-10-02 13:00:26.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:26 np0005466013 nova_compute[192144]: 2025-10-02 13:00:26.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:00:27 np0005466013 nova_compute[192144]: 2025-10-02 13:00:27.017 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:00:27 np0005466013 nova_compute[192144]: 2025-10-02 13:00:27.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:29 np0005466013 nova_compute[192144]: 2025-10-02 13:00:29.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:32 np0005466013 nova_compute[192144]: 2025-10-02 13:00:32.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:32 np0005466013 podman[260665]: 2025-10-02 13:00:32.711829388 +0000 UTC m=+0.078320952 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:00:32 np0005466013 podman[260664]: 2025-10-02 13:00:32.718803467 +0000 UTC m=+0.081476881 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:00:32 np0005466013 podman[260666]: 2025-10-02 13:00:32.739952912 +0000 UTC m=+0.094396027 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 09:00:34 np0005466013 nova_compute[192144]: 2025-10-02 13:00:34.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:37 np0005466013 nova_compute[192144]: 2025-10-02 13:00:37.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:39 np0005466013 nova_compute[192144]: 2025-10-02 13:00:39.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:41 np0005466013 podman[260733]: 2025-10-02 13:00:41.698918693 +0000 UTC m=+0.072971133 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal)
Oct  2 09:00:41 np0005466013 podman[260732]: 2025-10-02 13:00:41.723611439 +0000 UTC m=+0.093021284 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 09:00:41 np0005466013 podman[260734]: 2025-10-02 13:00:41.748565193 +0000 UTC m=+0.107157077 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 09:00:42 np0005466013 nova_compute[192144]: 2025-10-02 13:00:42.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:42 np0005466013 nova_compute[192144]: 2025-10-02 13:00:42.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:44 np0005466013 nova_compute[192144]: 2025-10-02 13:00:44.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:46 np0005466013 podman[260793]: 2025-10-02 13:00:46.710328072 +0000 UTC m=+0.072971934 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid)
Oct  2 09:00:46 np0005466013 podman[260792]: 2025-10-02 13:00:46.712777439 +0000 UTC m=+0.075624197 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 09:00:47 np0005466013 nova_compute[192144]: 2025-10-02 13:00:47.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:49 np0005466013 nova_compute[192144]: 2025-10-02 13:00:49.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:52 np0005466013 nova_compute[192144]: 2025-10-02 13:00:52.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:54 np0005466013 nova_compute[192144]: 2025-10-02 13:00:54.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:57 np0005466013 nova_compute[192144]: 2025-10-02 13:00:57.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:59 np0005466013 nova_compute[192144]: 2025-10-02 13:00:59.051 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:59 np0005466013 nova_compute[192144]: 2025-10-02 13:00:59.051 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:59 np0005466013 nova_compute[192144]: 2025-10-02 13:00:59.052 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:00:59 np0005466013 nova_compute[192144]: 2025-10-02 13:00:59.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:02 np0005466013 nova_compute[192144]: 2025-10-02 13:01:02.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:01:02.346 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:01:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:01:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:03 np0005466013 podman[260849]: 2025-10-02 13:01:03.734282143 +0000 UTC m=+0.085528149 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:01:03 np0005466013 podman[260848]: 2025-10-02 13:01:03.744055189 +0000 UTC m=+0.102608314 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 09:01:03 np0005466013 podman[260850]: 2025-10-02 13:01:03.767256949 +0000 UTC m=+0.119855287 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:01:04 np0005466013 nova_compute[192144]: 2025-10-02 13:01:04.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:05 np0005466013 nova_compute[192144]: 2025-10-02 13:01:05.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:05 np0005466013 nova_compute[192144]: 2025-10-02 13:01:05.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.100 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.100 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.101 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.101 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.280 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.281 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5680MB free_disk=73.13324356079102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.281 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.282 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.426 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.426 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.761 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.988 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:01:06 np0005466013 nova_compute[192144]: 2025-10-02 13:01:06.989 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:01:07 np0005466013 nova_compute[192144]: 2025-10-02 13:01:07.038 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:01:07 np0005466013 nova_compute[192144]: 2025-10-02 13:01:07.069 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:01:07 np0005466013 nova_compute[192144]: 2025-10-02 13:01:07.139 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:01:07 np0005466013 nova_compute[192144]: 2025-10-02 13:01:07.167 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:01:07 np0005466013 nova_compute[192144]: 2025-10-02 13:01:07.169 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:01:07 np0005466013 nova_compute[192144]: 2025-10-02 13:01:07.169 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:07 np0005466013 nova_compute[192144]: 2025-10-02 13:01:07.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:09 np0005466013 nova_compute[192144]: 2025-10-02 13:01:09.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:11 np0005466013 nova_compute[192144]: 2025-10-02 13:01:11.171 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:11 np0005466013 nova_compute[192144]: 2025-10-02 13:01:11.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:12 np0005466013 nova_compute[192144]: 2025-10-02 13:01:12.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:12 np0005466013 podman[260923]: 2025-10-02 13:01:12.704645306 +0000 UTC m=+0.068038809 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:01:12 np0005466013 podman[260921]: 2025-10-02 13:01:12.705629196 +0000 UTC m=+0.075423091 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 09:01:12 np0005466013 podman[260922]: 2025-10-02 13:01:12.710262091 +0000 UTC m=+0.079111365 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, vendor=Red Hat, Inc.)
Oct  2 09:01:14 np0005466013 nova_compute[192144]: 2025-10-02 13:01:14.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:14 np0005466013 nova_compute[192144]: 2025-10-02 13:01:14.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:14 np0005466013 nova_compute[192144]: 2025-10-02 13:01:14.997 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:01:14 np0005466013 nova_compute[192144]: 2025-10-02 13:01:14.997 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:01:15 np0005466013 nova_compute[192144]: 2025-10-02 13:01:15.086 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:01:16 np0005466013 nova_compute[192144]: 2025-10-02 13:01:16.078 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:01:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466013 nova_compute[192144]: 2025-10-02 13:01:16.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:17 np0005466013 nova_compute[192144]: 2025-10-02 13:01:17.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:17 np0005466013 podman[260980]: 2025-10-02 13:01:17.687098531 +0000 UTC m=+0.060757949 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:01:17 np0005466013 podman[260981]: 2025-10-02 13:01:17.730720211 +0000 UTC m=+0.096598555 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:01:19 np0005466013 nova_compute[192144]: 2025-10-02 13:01:19.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:22 np0005466013 nova_compute[192144]: 2025-10-02 13:01:22.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:23 np0005466013 nova_compute[192144]: 2025-10-02 13:01:23.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:24 np0005466013 nova_compute[192144]: 2025-10-02 13:01:24.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:27 np0005466013 nova_compute[192144]: 2025-10-02 13:01:27.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:29 np0005466013 nova_compute[192144]: 2025-10-02 13:01:29.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:32 np0005466013 nova_compute[192144]: 2025-10-02 13:01:32.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:34 np0005466013 nova_compute[192144]: 2025-10-02 13:01:34.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:34 np0005466013 podman[261028]: 2025-10-02 13:01:34.71185506 +0000 UTC m=+0.066466220 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:01:34 np0005466013 podman[261027]: 2025-10-02 13:01:34.718806018 +0000 UTC m=+0.078340112 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:01:34 np0005466013 podman[261029]: 2025-10-02 13:01:34.736777902 +0000 UTC m=+0.096681058 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:01:37 np0005466013 nova_compute[192144]: 2025-10-02 13:01:37.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:39 np0005466013 nova_compute[192144]: 2025-10-02 13:01:39.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:42 np0005466013 nova_compute[192144]: 2025-10-02 13:01:42.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:43 np0005466013 podman[261096]: 2025-10-02 13:01:43.679798644 +0000 UTC m=+0.060717309 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 09:01:43 np0005466013 podman[261098]: 2025-10-02 13:01:43.708708822 +0000 UTC m=+0.072920392 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible)
Oct  2 09:01:43 np0005466013 podman[261097]: 2025-10-02 13:01:43.713268625 +0000 UTC m=+0.086167278 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Oct  2 09:01:44 np0005466013 nova_compute[192144]: 2025-10-02 13:01:44.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:47 np0005466013 nova_compute[192144]: 2025-10-02 13:01:47.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:48 np0005466013 podman[261162]: 2025-10-02 13:01:48.688823167 +0000 UTC m=+0.062718581 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 09:01:48 np0005466013 podman[261163]: 2025-10-02 13:01:48.696022034 +0000 UTC m=+0.071289451 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:01:49 np0005466013 nova_compute[192144]: 2025-10-02 13:01:49.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:52 np0005466013 nova_compute[192144]: 2025-10-02 13:01:52.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:54 np0005466013 nova_compute[192144]: 2025-10-02 13:01:54.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:57 np0005466013 nova_compute[192144]: 2025-10-02 13:01:57.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:59 np0005466013 nova_compute[192144]: 2025-10-02 13:01:59.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:59 np0005466013 nova_compute[192144]: 2025-10-02 13:01:59.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:59 np0005466013 nova_compute[192144]: 2025-10-02 13:01:59.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:02:00 np0005466013 nova_compute[192144]: 2025-10-02 13:02:00.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:02 np0005466013 nova_compute[192144]: 2025-10-02 13:02:02.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:02:02.346 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:02:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:02:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:04 np0005466013 nova_compute[192144]: 2025-10-02 13:02:04.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:05 np0005466013 podman[261216]: 2025-10-02 13:02:05.707152543 +0000 UTC m=+0.067471421 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 09:02:05 np0005466013 podman[261215]: 2025-10-02 13:02:05.715368271 +0000 UTC m=+0.077109964 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:02:05 np0005466013 podman[261217]: 2025-10-02 13:02:05.788674134 +0000 UTC m=+0.145933046 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 09:02:06 np0005466013 nova_compute[192144]: 2025-10-02 13:02:06.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:07 np0005466013 nova_compute[192144]: 2025-10-02 13:02:07.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:07 np0005466013 nova_compute[192144]: 2025-10-02 13:02:07.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.031 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.032 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.032 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.032 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.190 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.191 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5667MB free_disk=73.13312530517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.191 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.191 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.250 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.251 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.277 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.314 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.316 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:02:08 np0005466013 nova_compute[192144]: 2025-10-02 13:02:08.317 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:09 np0005466013 nova_compute[192144]: 2025-10-02 13:02:09.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:10 np0005466013 nova_compute[192144]: 2025-10-02 13:02:10.318 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:11 np0005466013 nova_compute[192144]: 2025-10-02 13:02:11.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:12 np0005466013 nova_compute[192144]: 2025-10-02 13:02:12.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:14 np0005466013 nova_compute[192144]: 2025-10-02 13:02:14.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:14 np0005466013 podman[261285]: 2025-10-02 13:02:14.686818207 +0000 UTC m=+0.059703747 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 09:02:14 np0005466013 podman[261286]: 2025-10-02 13:02:14.708908561 +0000 UTC m=+0.070461865 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 09:02:14 np0005466013 podman[261287]: 2025-10-02 13:02:14.722767746 +0000 UTC m=+0.088081288 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:02:15 np0005466013 nova_compute[192144]: 2025-10-02 13:02:15.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:15 np0005466013 nova_compute[192144]: 2025-10-02 13:02:15.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:02:15 np0005466013 nova_compute[192144]: 2025-10-02 13:02:15.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:02:16 np0005466013 nova_compute[192144]: 2025-10-02 13:02:16.025 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:02:17 np0005466013 nova_compute[192144]: 2025-10-02 13:02:17.020 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:17 np0005466013 nova_compute[192144]: 2025-10-02 13:02:17.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:17 np0005466013 nova_compute[192144]: 2025-10-02 13:02:17.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:19 np0005466013 podman[261344]: 2025-10-02 13:02:19.668581663 +0000 UTC m=+0.047401000 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:02:19 np0005466013 nova_compute[192144]: 2025-10-02 13:02:19.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:19 np0005466013 podman[261343]: 2025-10-02 13:02:19.702124817 +0000 UTC m=+0.081047837 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:02:22 np0005466013 nova_compute[192144]: 2025-10-02 13:02:22.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:24 np0005466013 nova_compute[192144]: 2025-10-02 13:02:24.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:27 np0005466013 nova_compute[192144]: 2025-10-02 13:02:27.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:29 np0005466013 nova_compute[192144]: 2025-10-02 13:02:29.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:32 np0005466013 nova_compute[192144]: 2025-10-02 13:02:32.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:34 np0005466013 nova_compute[192144]: 2025-10-02 13:02:34.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:36 np0005466013 podman[261382]: 2025-10-02 13:02:36.679594809 +0000 UTC m=+0.060541084 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:02:36 np0005466013 podman[261383]: 2025-10-02 13:02:36.705485291 +0000 UTC m=+0.073390296 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:02:36 np0005466013 podman[261384]: 2025-10-02 13:02:36.751069913 +0000 UTC m=+0.120197137 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:02:37 np0005466013 nova_compute[192144]: 2025-10-02 13:02:37.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:39 np0005466013 nova_compute[192144]: 2025-10-02 13:02:39.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:42 np0005466013 nova_compute[192144]: 2025-10-02 13:02:42.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:44 np0005466013 nova_compute[192144]: 2025-10-02 13:02:44.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:45 np0005466013 podman[261450]: 2025-10-02 13:02:45.671985822 +0000 UTC m=+0.048503225 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:02:45 np0005466013 podman[261452]: 2025-10-02 13:02:45.687457008 +0000 UTC m=+0.055643290 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 09:02:45 np0005466013 podman[261451]: 2025-10-02 13:02:45.688038416 +0000 UTC m=+0.058805879 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-type=git, release=1755695350, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6)
Oct  2 09:02:47 np0005466013 nova_compute[192144]: 2025-10-02 13:02:47.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:49 np0005466013 nova_compute[192144]: 2025-10-02 13:02:49.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:50 np0005466013 podman[261508]: 2025-10-02 13:02:50.683916755 +0000 UTC m=+0.057210458 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:02:50 np0005466013 podman[261509]: 2025-10-02 13:02:50.710038816 +0000 UTC m=+0.072281302 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:02:52 np0005466013 nova_compute[192144]: 2025-10-02 13:02:52.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:54 np0005466013 nova_compute[192144]: 2025-10-02 13:02:54.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:57 np0005466013 nova_compute[192144]: 2025-10-02 13:02:57.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:59 np0005466013 nova_compute[192144]: 2025-10-02 13:02:59.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:00 np0005466013 nova_compute[192144]: 2025-10-02 13:03:00.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:01 np0005466013 nova_compute[192144]: 2025-10-02 13:03:01.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:01 np0005466013 nova_compute[192144]: 2025-10-02 13:03:01.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:03:02 np0005466013 nova_compute[192144]: 2025-10-02 13:03:02.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:03:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:03:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:03:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:04 np0005466013 nova_compute[192144]: 2025-10-02 13:03:04.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:06 np0005466013 nova_compute[192144]: 2025-10-02 13:03:06.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:07 np0005466013 nova_compute[192144]: 2025-10-02 13:03:07.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:07 np0005466013 podman[261552]: 2025-10-02 13:03:07.676931325 +0000 UTC m=+0.051309694 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:03:07 np0005466013 podman[261554]: 2025-10-02 13:03:07.700936229 +0000 UTC m=+0.068301077 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:03:07 np0005466013 podman[261553]: 2025-10-02 13:03:07.700904237 +0000 UTC m=+0.071564479 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 09:03:09 np0005466013 nova_compute[192144]: 2025-10-02 13:03:09.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:09 np0005466013 nova_compute[192144]: 2025-10-02 13:03:09.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:09 np0005466013 nova_compute[192144]: 2025-10-02 13:03:09.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.023 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.023 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.156 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.157 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5675MB free_disk=73.13314056396484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.157 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.158 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.452 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.452 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.470 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.482 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.483 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:03:10 np0005466013 nova_compute[192144]: 2025-10-02 13:03:10.483 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:12 np0005466013 nova_compute[192144]: 2025-10-02 13:03:12.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:13 np0005466013 nova_compute[192144]: 2025-10-02 13:03:13.483 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:14 np0005466013 nova_compute[192144]: 2025-10-02 13:03:14.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:03:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466013 podman[261617]: 2025-10-02 13:03:16.678319381 +0000 UTC m=+0.057973133 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:03:16 np0005466013 podman[261618]: 2025-10-02 13:03:16.697229425 +0000 UTC m=+0.069781123 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal)
Oct  2 09:03:16 np0005466013 podman[261619]: 2025-10-02 13:03:16.69995215 +0000 UTC m=+0.062326098 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 09:03:16 np0005466013 nova_compute[192144]: 2025-10-02 13:03:16.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:16 np0005466013 nova_compute[192144]: 2025-10-02 13:03:16.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:03:16 np0005466013 nova_compute[192144]: 2025-10-02 13:03:16.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:03:17 np0005466013 nova_compute[192144]: 2025-10-02 13:03:17.030 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:03:17 np0005466013 nova_compute[192144]: 2025-10-02 13:03:17.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:17 np0005466013 nova_compute[192144]: 2025-10-02 13:03:17.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:17 np0005466013 nova_compute[192144]: 2025-10-02 13:03:17.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:19 np0005466013 nova_compute[192144]: 2025-10-02 13:03:19.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:21 np0005466013 podman[261676]: 2025-10-02 13:03:21.665532858 +0000 UTC m=+0.044753447 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 09:03:21 np0005466013 podman[261677]: 2025-10-02 13:03:21.667140609 +0000 UTC m=+0.043837558 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:03:22 np0005466013 nova_compute[192144]: 2025-10-02 13:03:22.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:24 np0005466013 nova_compute[192144]: 2025-10-02 13:03:24.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:24 np0005466013 nova_compute[192144]: 2025-10-02 13:03:24.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:27 np0005466013 nova_compute[192144]: 2025-10-02 13:03:27.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:29 np0005466013 nova_compute[192144]: 2025-10-02 13:03:29.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:32 np0005466013 nova_compute[192144]: 2025-10-02 13:03:32.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:34 np0005466013 nova_compute[192144]: 2025-10-02 13:03:34.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:37 np0005466013 nova_compute[192144]: 2025-10-02 13:03:37.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:38 np0005466013 podman[261719]: 2025-10-02 13:03:38.700737534 +0000 UTC m=+0.068421622 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 09:03:38 np0005466013 podman[261720]: 2025-10-02 13:03:38.713526255 +0000 UTC m=+0.083792613 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  2 09:03:38 np0005466013 podman[261721]: 2025-10-02 13:03:38.752617933 +0000 UTC m=+0.109232232 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 09:03:39 np0005466013 nova_compute[192144]: 2025-10-02 13:03:39.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:42 np0005466013 nova_compute[192144]: 2025-10-02 13:03:42.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:44 np0005466013 nova_compute[192144]: 2025-10-02 13:03:44.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:47 np0005466013 nova_compute[192144]: 2025-10-02 13:03:47.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:47 np0005466013 podman[261787]: 2025-10-02 13:03:47.684706492 +0000 UTC m=+0.064009062 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 09:03:47 np0005466013 podman[261788]: 2025-10-02 13:03:47.686515949 +0000 UTC m=+0.062557997 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 09:03:47 np0005466013 podman[261789]: 2025-10-02 13:03:47.699558088 +0000 UTC m=+0.072788227 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  2 09:03:49 np0005466013 nova_compute[192144]: 2025-10-02 13:03:49.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:52 np0005466013 nova_compute[192144]: 2025-10-02 13:03:52.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:52 np0005466013 podman[261848]: 2025-10-02 13:03:52.671826115 +0000 UTC m=+0.046246183 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:03:52 np0005466013 podman[261849]: 2025-10-02 13:03:52.701862369 +0000 UTC m=+0.065384576 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0)
Oct  2 09:03:54 np0005466013 nova_compute[192144]: 2025-10-02 13:03:54.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:57 np0005466013 nova_compute[192144]: 2025-10-02 13:03:57.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:59 np0005466013 nova_compute[192144]: 2025-10-02 13:03:59.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:00 np0005466013 nova_compute[192144]: 2025-10-02 13:04:00.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:01 np0005466013 nova_compute[192144]: 2025-10-02 13:04:01.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:01 np0005466013 nova_compute[192144]: 2025-10-02 13:04:01.995 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:04:02 np0005466013 nova_compute[192144]: 2025-10-02 13:04:02.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:04:02.347 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:04:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:04:02.348 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:04:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:04:02.348 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:04:04 np0005466013 nova_compute[192144]: 2025-10-02 13:04:04.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:07 np0005466013 nova_compute[192144]: 2025-10-02 13:04:07.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:08 np0005466013 nova_compute[192144]: 2025-10-02 13:04:08.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:09 np0005466013 podman[261893]: 2025-10-02 13:04:09.697554664 +0000 UTC m=+0.059389497 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:04:09 np0005466013 podman[261892]: 2025-10-02 13:04:09.704378659 +0000 UTC m=+0.075188574 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:04:09 np0005466013 podman[261894]: 2025-10-02 13:04:09.733106351 +0000 UTC m=+0.089546205 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 09:04:09 np0005466013 nova_compute[192144]: 2025-10-02 13:04:09.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:09 np0005466013 nova_compute[192144]: 2025-10-02 13:04:09.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.021 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.022 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.022 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.173 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.174 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5679MB free_disk=73.1329574584961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.175 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.175 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.231 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.232 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.249 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.267 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.268 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:04:10 np0005466013 nova_compute[192144]: 2025-10-02 13:04:10.268 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:04:12 np0005466013 nova_compute[192144]: 2025-10-02 13:04:12.269 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:12 np0005466013 nova_compute[192144]: 2025-10-02 13:04:12.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:13 np0005466013 nova_compute[192144]: 2025-10-02 13:04:13.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:14 np0005466013 nova_compute[192144]: 2025-10-02 13:04:14.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:17 np0005466013 nova_compute[192144]: 2025-10-02 13:04:17.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:18 np0005466013 podman[261961]: 2025-10-02 13:04:18.673358647 +0000 UTC m=+0.053169422 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 09:04:18 np0005466013 podman[261962]: 2025-10-02 13:04:18.679568601 +0000 UTC m=+0.056377222 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 09:04:18 np0005466013 podman[261963]: 2025-10-02 13:04:18.68142018 +0000 UTC m=+0.054457572 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  2 09:04:18 np0005466013 nova_compute[192144]: 2025-10-02 13:04:18.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:18 np0005466013 nova_compute[192144]: 2025-10-02 13:04:18.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:18 np0005466013 nova_compute[192144]: 2025-10-02 13:04:18.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:04:18 np0005466013 nova_compute[192144]: 2025-10-02 13:04:18.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:04:19 np0005466013 nova_compute[192144]: 2025-10-02 13:04:19.010 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:04:19 np0005466013 nova_compute[192144]: 2025-10-02 13:04:19.011 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:19 np0005466013 nova_compute[192144]: 2025-10-02 13:04:19.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:22 np0005466013 nova_compute[192144]: 2025-10-02 13:04:22.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:23 np0005466013 podman[262022]: 2025-10-02 13:04:23.659832071 +0000 UTC m=+0.040617557 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 09:04:23 np0005466013 podman[262023]: 2025-10-02 13:04:23.667884614 +0000 UTC m=+0.045343895 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible)
Oct  2 09:04:24 np0005466013 nova_compute[192144]: 2025-10-02 13:04:24.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:27 np0005466013 nova_compute[192144]: 2025-10-02 13:04:27.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:29 np0005466013 nova_compute[192144]: 2025-10-02 13:04:29.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:32 np0005466013 nova_compute[192144]: 2025-10-02 13:04:32.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:34 np0005466013 nova_compute[192144]: 2025-10-02 13:04:34.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:37 np0005466013 nova_compute[192144]: 2025-10-02 13:04:37.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:39 np0005466013 nova_compute[192144]: 2025-10-02 13:04:39.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:40 np0005466013 podman[262068]: 2025-10-02 13:04:40.680708475 +0000 UTC m=+0.055506584 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:04:40 np0005466013 podman[262069]: 2025-10-02 13:04:40.69579081 +0000 UTC m=+0.061066640 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:04:40 np0005466013 podman[262070]: 2025-10-02 13:04:40.709558432 +0000 UTC m=+0.079657673 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:04:42 np0005466013 nova_compute[192144]: 2025-10-02 13:04:42.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:44 np0005466013 nova_compute[192144]: 2025-10-02 13:04:44.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:47 np0005466013 nova_compute[192144]: 2025-10-02 13:04:47.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:49 np0005466013 podman[262134]: 2025-10-02 13:04:49.67249394 +0000 UTC m=+0.052319555 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350)
Oct  2 09:04:49 np0005466013 podman[262133]: 2025-10-02 13:04:49.685662663 +0000 UTC m=+0.063654091 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:04:49 np0005466013 podman[262135]: 2025-10-02 13:04:49.708116988 +0000 UTC m=+0.080090357 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 09:04:49 np0005466013 nova_compute[192144]: 2025-10-02 13:04:49.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:52 np0005466013 nova_compute[192144]: 2025-10-02 13:04:52.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:54 np0005466013 podman[262194]: 2025-10-02 13:04:54.719485955 +0000 UTC m=+0.083368789 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:04:54 np0005466013 podman[262195]: 2025-10-02 13:04:54.725899527 +0000 UTC m=+0.086422986 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:04:54 np0005466013 nova_compute[192144]: 2025-10-02 13:04:54.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:57 np0005466013 nova_compute[192144]: 2025-10-02 13:04:57.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:59 np0005466013 nova_compute[192144]: 2025-10-02 13:04:59.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:01 np0005466013 nova_compute[192144]: 2025-10-02 13:05:01.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:05:02.348 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:05:02.349 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:05:02.349 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:02 np0005466013 nova_compute[192144]: 2025-10-02 13:05:02.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:03 np0005466013 nova_compute[192144]: 2025-10-02 13:05:03.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:03 np0005466013 nova_compute[192144]: 2025-10-02 13:05:03.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:05:04 np0005466013 nova_compute[192144]: 2025-10-02 13:05:04.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:07 np0005466013 nova_compute[192144]: 2025-10-02 13:05:07.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:09 np0005466013 nova_compute[192144]: 2025-10-02 13:05:09.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:10 np0005466013 nova_compute[192144]: 2025-10-02 13:05:10.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:11 np0005466013 podman[262236]: 2025-10-02 13:05:11.665396935 +0000 UTC m=+0.046209253 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:05:11 np0005466013 podman[262237]: 2025-10-02 13:05:11.678631291 +0000 UTC m=+0.051613623 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 09:05:11 np0005466013 podman[262238]: 2025-10-02 13:05:11.704771342 +0000 UTC m=+0.078731104 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Oct  2 09:05:11 np0005466013 nova_compute[192144]: 2025-10-02 13:05:11.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.056 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.057 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.057 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.057 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.184 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.186 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5677MB free_disk=73.1329574584961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.186 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.186 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.822 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.823 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.850 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.971 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.973 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:05:12 np0005466013 nova_compute[192144]: 2025-10-02 13:05:12.974 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.788s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:14 np0005466013 nova_compute[192144]: 2025-10-02 13:05:14.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:14 np0005466013 nova_compute[192144]: 2025-10-02 13:05:14.974 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:15 np0005466013 nova_compute[192144]: 2025-10-02 13:05:15.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:05:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:17 np0005466013 nova_compute[192144]: 2025-10-02 13:05:17.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:17 np0005466013 nova_compute[192144]: 2025-10-02 13:05:17.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:17 np0005466013 nova_compute[192144]: 2025-10-02 13:05:17.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:05:19 np0005466013 nova_compute[192144]: 2025-10-02 13:05:19.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:20 np0005466013 nova_compute[192144]: 2025-10-02 13:05:20.060 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:20 np0005466013 nova_compute[192144]: 2025-10-02 13:05:20.061 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:05:20 np0005466013 nova_compute[192144]: 2025-10-02 13:05:20.061 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:05:20 np0005466013 nova_compute[192144]: 2025-10-02 13:05:20.086 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:05:20 np0005466013 nova_compute[192144]: 2025-10-02 13:05:20.087 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:20 np0005466013 podman[262302]: 2025-10-02 13:05:20.672690968 +0000 UTC m=+0.051087666 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 09:05:20 np0005466013 podman[262303]: 2025-10-02 13:05:20.672726048 +0000 UTC m=+0.049148945 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 09:05:20 np0005466013 podman[262304]: 2025-10-02 13:05:20.676565609 +0000 UTC m=+0.047536334 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, 
org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:05:21 np0005466013 nova_compute[192144]: 2025-10-02 13:05:21.015 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:22 np0005466013 nova_compute[192144]: 2025-10-02 13:05:22.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:24 np0005466013 nova_compute[192144]: 2025-10-02 13:05:24.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:25 np0005466013 podman[262364]: 2025-10-02 13:05:25.669807776 +0000 UTC m=+0.047584026 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:05:25 np0005466013 podman[262365]: 2025-10-02 13:05:25.692981225 +0000 UTC m=+0.058618243 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:05:26 np0005466013 nova_compute[192144]: 2025-10-02 13:05:26.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:26 np0005466013 nova_compute[192144]: 2025-10-02 13:05:26.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:05:27 np0005466013 nova_compute[192144]: 2025-10-02 13:05:27.026 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:05:27 np0005466013 nova_compute[192144]: 2025-10-02 13:05:27.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:29 np0005466013 nova_compute[192144]: 2025-10-02 13:05:29.022 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:29 np0005466013 nova_compute[192144]: 2025-10-02 13:05:29.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:32 np0005466013 nova_compute[192144]: 2025-10-02 13:05:32.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:34 np0005466013 nova_compute[192144]: 2025-10-02 13:05:34.359 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:34 np0005466013 nova_compute[192144]: 2025-10-02 13:05:34.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:37 np0005466013 nova_compute[192144]: 2025-10-02 13:05:37.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:39 np0005466013 nova_compute[192144]: 2025-10-02 13:05:39.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:42 np0005466013 nova_compute[192144]: 2025-10-02 13:05:42.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:42 np0005466013 podman[262410]: 2025-10-02 13:05:42.674313847 +0000 UTC m=+0.051972444 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 09:05:42 np0005466013 podman[262409]: 2025-10-02 13:05:42.679671255 +0000 UTC m=+0.055297008 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:05:42 np0005466013 podman[262411]: 2025-10-02 13:05:42.707569631 +0000 UTC m=+0.082134551 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 09:05:44 np0005466013 nova_compute[192144]: 2025-10-02 13:05:44.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:47 np0005466013 nova_compute[192144]: 2025-10-02 13:05:47.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:49 np0005466013 nova_compute[192144]: 2025-10-02 13:05:49.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:51 np0005466013 podman[262478]: 2025-10-02 13:05:51.687368409 +0000 UTC m=+0.062116733 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:05:51 np0005466013 podman[262485]: 2025-10-02 13:05:51.69855796 +0000 UTC m=+0.060029817 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 09:05:51 np0005466013 podman[262479]: 2025-10-02 13:05:51.70394229 +0000 UTC m=+0.063931360 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Oct  2 09:05:52 np0005466013 nova_compute[192144]: 2025-10-02 13:05:52.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:54 np0005466013 nova_compute[192144]: 2025-10-02 13:05:54.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:56 np0005466013 podman[262539]: 2025-10-02 13:05:56.671054026 +0000 UTC m=+0.049925429 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:05:56 np0005466013 podman[262538]: 2025-10-02 13:05:56.68644098 +0000 UTC m=+0.060273475 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 09:05:56 np0005466013 nova_compute[192144]: 2025-10-02 13:05:56.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:57 np0005466013 nova_compute[192144]: 2025-10-02 13:05:57.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:59 np0005466013 nova_compute[192144]: 2025-10-02 13:05:59.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:06:02.350 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:06:02.350 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:06:02.350 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:02 np0005466013 nova_compute[192144]: 2025-10-02 13:06:02.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:04 np0005466013 nova_compute[192144]: 2025-10-02 13:06:04.008 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:04 np0005466013 nova_compute[192144]: 2025-10-02 13:06:04.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:04 np0005466013 nova_compute[192144]: 2025-10-02 13:06:04.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:04 np0005466013 nova_compute[192144]: 2025-10-02 13:06:04.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:06:07 np0005466013 nova_compute[192144]: 2025-10-02 13:06:07.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:09 np0005466013 nova_compute[192144]: 2025-10-02 13:06:09.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:11 np0005466013 nova_compute[192144]: 2025-10-02 13:06:11.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:12 np0005466013 nova_compute[192144]: 2025-10-02 13:06:12.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:12 np0005466013 nova_compute[192144]: 2025-10-02 13:06:12.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.033 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.034 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.034 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.178 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.179 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5684MB free_disk=73.13305282592773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.179 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.180 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.452 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.453 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.544 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing inventories for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.560 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating ProviderTree inventory for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.560 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Updating inventory in ProviderTree for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.573 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing aggregate associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.589 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Refreshing trait associations for resource provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.625 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.638 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.639 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:06:13 np0005466013 nova_compute[192144]: 2025-10-02 13:06:13.639 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:13 np0005466013 podman[262583]: 2025-10-02 13:06:13.671455488 +0000 UTC m=+0.048407583 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:06:13 np0005466013 podman[262584]: 2025-10-02 13:06:13.680657616 +0000 UTC m=+0.054338338 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:06:13 np0005466013 podman[262585]: 2025-10-02 13:06:13.767659439 +0000 UTC m=+0.138068208 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 09:06:14 np0005466013 nova_compute[192144]: 2025-10-02 13:06:14.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:15 np0005466013 nova_compute[192144]: 2025-10-02 13:06:15.641 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:15 np0005466013 nova_compute[192144]: 2025-10-02 13:06:15.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:17 np0005466013 nova_compute[192144]: 2025-10-02 13:06:17.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:19 np0005466013 nova_compute[192144]: 2025-10-02 13:06:19.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:19 np0005466013 nova_compute[192144]: 2025-10-02 13:06:19.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:19 np0005466013 nova_compute[192144]: 2025-10-02 13:06:19.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:06:19 np0005466013 nova_compute[192144]: 2025-10-02 13:06:19.994 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:06:20 np0005466013 nova_compute[192144]: 2025-10-02 13:06:20.025 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:06:21 np0005466013 nova_compute[192144]: 2025-10-02 13:06:21.996 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:22 np0005466013 nova_compute[192144]: 2025-10-02 13:06:22.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:22 np0005466013 podman[262645]: 2025-10-02 13:06:22.681441425 +0000 UTC m=+0.058154918 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 09:06:22 np0005466013 podman[262646]: 2025-10-02 13:06:22.691785939 +0000 UTC m=+0.064103635 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 09:06:22 np0005466013 podman[262647]: 2025-10-02 13:06:22.709200147 +0000 UTC m=+0.082075100 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 09:06:22 np0005466013 nova_compute[192144]: 2025-10-02 13:06:22.989 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:24 np0005466013 nova_compute[192144]: 2025-10-02 13:06:24.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:27 np0005466013 nova_compute[192144]: 2025-10-02 13:06:27.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:27 np0005466013 podman[262701]: 2025-10-02 13:06:27.679225973 +0000 UTC m=+0.058762907 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:06:27 np0005466013 podman[262702]: 2025-10-02 13:06:27.697501647 +0000 UTC m=+0.066100418 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:06:29 np0005466013 nova_compute[192144]: 2025-10-02 13:06:29.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:32 np0005466013 nova_compute[192144]: 2025-10-02 13:06:32.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:35 np0005466013 nova_compute[192144]: 2025-10-02 13:06:35.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:37 np0005466013 nova_compute[192144]: 2025-10-02 13:06:37.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:40 np0005466013 nova_compute[192144]: 2025-10-02 13:06:40.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:42 np0005466013 nova_compute[192144]: 2025-10-02 13:06:42.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:44 np0005466013 podman[262743]: 2025-10-02 13:06:44.696744876 +0000 UTC m=+0.067848623 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:06:44 np0005466013 podman[262744]: 2025-10-02 13:06:44.697267462 +0000 UTC m=+0.062831166 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 09:06:44 np0005466013 podman[262745]: 2025-10-02 13:06:44.716515601 +0000 UTC m=+0.084983176 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 09:06:45 np0005466013 nova_compute[192144]: 2025-10-02 13:06:45.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:47 np0005466013 nova_compute[192144]: 2025-10-02 13:06:47.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:50 np0005466013 nova_compute[192144]: 2025-10-02 13:06:50.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:52 np0005466013 nova_compute[192144]: 2025-10-02 13:06:52.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:53 np0005466013 podman[262808]: 2025-10-02 13:06:53.678845782 +0000 UTC m=+0.057647545 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:06:53 np0005466013 podman[262810]: 2025-10-02 13:06:53.691168736 +0000 UTC m=+0.053155366 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 09:06:53 np0005466013 podman[262809]: 2025-10-02 13:06:53.707293268 +0000 UTC m=+0.081709825 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, release=1755695350, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Oct  2 09:06:55 np0005466013 nova_compute[192144]: 2025-10-02 13:06:55.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:57 np0005466013 nova_compute[192144]: 2025-10-02 13:06:57.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:58 np0005466013 podman[262871]: 2025-10-02 13:06:58.678624699 +0000 UTC m=+0.047605522 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  2 09:06:58 np0005466013 podman[262870]: 2025-10-02 13:06:58.702683378 +0000 UTC m=+0.075718648 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 09:07:00 np0005466013 nova_compute[192144]: 2025-10-02 13:07:00.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:07:02.350 103323 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:07:02.351 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:02 np0005466013 ovn_metadata_agent[103318]: 2025-10-02 13:07:02.351 103323 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:02 np0005466013 nova_compute[192144]: 2025-10-02 13:07:02.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:03 np0005466013 nova_compute[192144]: 2025-10-02 13:07:03.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:05 np0005466013 nova_compute[192144]: 2025-10-02 13:07:05.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:06 np0005466013 nova_compute[192144]: 2025-10-02 13:07:06.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:06 np0005466013 nova_compute[192144]: 2025-10-02 13:07:06.993 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:07:07 np0005466013 nova_compute[192144]: 2025-10-02 13:07:07.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:10 np0005466013 nova_compute[192144]: 2025-10-02 13:07:10.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:12 np0005466013 nova_compute[192144]: 2025-10-02 13:07:12.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:12 np0005466013 nova_compute[192144]: 2025-10-02 13:07:12.994 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:13 np0005466013 nova_compute[192144]: 2025-10-02 13:07:13.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.027 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.027 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.167 2 WARNING nova.virt.libvirt.driver [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.168 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5693MB free_disk=73.13305282592773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.168 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.168 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.256 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.257 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.280 2 DEBUG nova.compute.provider_tree [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed in ProviderTree for provider: 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.297 2 DEBUG nova.scheduler.client.report [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Inventory has not changed for provider 8a5c5335-95d5-48d7-aa6f-2fc6c798dc80 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.298 2 DEBUG nova.compute.resource_tracker [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:07:14 np0005466013 nova_compute[192144]: 2025-10-02 13:07:14.299 2 DEBUG oslo_concurrency.lockutils [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:15 np0005466013 nova_compute[192144]: 2025-10-02 13:07:15.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:15 np0005466013 podman[262911]: 2025-10-02 13:07:15.682945328 +0000 UTC m=+0.051750352 container health_status 4d5b47d075dd8a4a5daba7b3a66dcc87be3ce16cb987134e9f0fa5479e21e2aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  2 09:07:15 np0005466013 podman[262910]: 2025-10-02 13:07:15.705980894 +0000 UTC m=+0.078018008 container health_status 16bc85b5c92aff23aada2a30feb5e86c3143aa8a7f6b3b09bc99f83e7e3f1e58 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:07:15 np0005466013 podman[262912]: 2025-10-02 13:07:15.722007353 +0000 UTC m=+0.086079850 container health_status ada46ee32524901aa48482d5ec8e668ba9edbe6e2385fb8e9d2b5ed4463cd486 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466013 ceilometer_agent_compute[202946]: 2025-10-02 13:07:16.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:17 np0005466013 nova_compute[192144]: 2025-10-02 13:07:17.299 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:17 np0005466013 nova_compute[192144]: 2025-10-02 13:07:17.300 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:17 np0005466013 nova_compute[192144]: 2025-10-02 13:07:17.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:20 np0005466013 nova_compute[192144]: 2025-10-02 13:07:20.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:21 np0005466013 nova_compute[192144]: 2025-10-02 13:07:21.995 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:21 np0005466013 nova_compute[192144]: 2025-10-02 13:07:21.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:07:21 np0005466013 nova_compute[192144]: 2025-10-02 13:07:21.996 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:07:22 np0005466013 nova_compute[192144]: 2025-10-02 13:07:22.020 2 DEBUG nova.compute.manager [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:07:22 np0005466013 nova_compute[192144]: 2025-10-02 13:07:22.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:23 np0005466013 nova_compute[192144]: 2025-10-02 13:07:23.014 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:23 np0005466013 nova_compute[192144]: 2025-10-02 13:07:23.993 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:24 np0005466013 podman[262974]: 2025-10-02 13:07:24.688618668 +0000 UTC m=+0.060169534 container health_status e90e12f685561aa820ab369fc00e87082a9bd013f34aa2f4a7f4eb88482cb2cc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Oct  2 09:07:24 np0005466013 podman[262973]: 2025-10-02 13:07:24.689086592 +0000 UTC m=+0.061261729 container health_status b8b4ae783d7759f1eed92a1373866d8307eb423ce3b52240798eae2e166a499d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 09:07:24 np0005466013 podman[262972]: 2025-10-02 13:07:24.699663591 +0000 UTC m=+0.082064575 container health_status 5a2e46f49a9a14144213ce19bebc49cd033235d6b3a426f2c16921ebe97d93b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 09:07:25 np0005466013 nova_compute[192144]: 2025-10-02 13:07:25.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:27 np0005466013 nova_compute[192144]: 2025-10-02 13:07:27.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:28 np0005466013 nova_compute[192144]: 2025-10-02 13:07:28.988 2 DEBUG oslo_service.periodic_task [None req-38105279-c107-49ae-b447-dc3266062d1b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:29 np0005466013 podman[263033]: 2025-10-02 13:07:29.676048731 +0000 UTC m=+0.050092080 container health_status f5bdf220b90afef83163b02f7f51879950afcf65fa5615e413a52a9edda00ffb (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 09:07:29 np0005466013 podman[263032]: 2025-10-02 13:07:29.684914696 +0000 UTC m=+0.063822186 container health_status bf82ca936085ab50c800a4b63f76edbf164e97cd3012aac2ec3f593120d56720 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:07:30 np0005466013 nova_compute[192144]: 2025-10-02 13:07:30.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:32 np0005466013 nova_compute[192144]: 2025-10-02 13:07:32.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:35 np0005466013 nova_compute[192144]: 2025-10-02 13:07:35.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:35 np0005466013 systemd-logind[784]: New session 45 of user zuul.
Oct  2 09:07:35 np0005466013 systemd[1]: Started Session 45 of User zuul.
Oct  2 09:07:37 np0005466013 nova_compute[192144]: 2025-10-02 13:07:37.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:40 np0005466013 nova_compute[192144]: 2025-10-02 13:07:40.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:40 np0005466013 ovs-vsctl[263244]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  2 09:07:41 np0005466013 virtqemud[191867]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  2 09:07:41 np0005466013 virtqemud[191867]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  2 09:07:41 np0005466013 virtqemud[191867]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 09:07:42 np0005466013 nova_compute[192144]: 2025-10-02 13:07:42.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:44 np0005466013 systemd[1]: Starting Hostname Service...
Oct  2 09:07:44 np0005466013 systemd[1]: Started Hostname Service.
Oct  2 09:07:45 np0005466013 nova_compute[192144]: 2025-10-02 13:07:45.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
